Nov 29 05:35:33 localhost kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 29 05:35:33 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 29 05:35:33 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 05:35:33 localhost kernel: BIOS-provided physical RAM map:
Nov 29 05:35:33 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 29 05:35:33 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 29 05:35:33 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 29 05:35:33 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 29 05:35:33 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 29 05:35:33 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 29 05:35:33 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 29 05:35:33 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 29 05:35:33 localhost kernel: NX (Execute Disable) protection: active
Nov 29 05:35:33 localhost kernel: APIC: Static calls initialized
Nov 29 05:35:33 localhost kernel: SMBIOS 2.8 present.
Nov 29 05:35:33 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 29 05:35:33 localhost kernel: Hypervisor detected: KVM
Nov 29 05:35:33 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 29 05:35:33 localhost kernel: kvm-clock: using sched offset of 3260837303 cycles
Nov 29 05:35:33 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 29 05:35:33 localhost kernel: tsc: Detected 2800.000 MHz processor
Nov 29 05:35:33 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 29 05:35:33 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 29 05:35:33 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 29 05:35:33 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 29 05:35:33 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 29 05:35:33 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 29 05:35:33 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 29 05:35:33 localhost kernel: Using GB pages for direct mapping
Nov 29 05:35:33 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 29 05:35:33 localhost kernel: ACPI: Early table checksum verification disabled
Nov 29 05:35:33 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 29 05:35:33 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:33 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:33 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:33 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 29 05:35:33 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:33 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:33 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 29 05:35:33 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 29 05:35:33 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 29 05:35:33 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 29 05:35:33 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 29 05:35:33 localhost kernel: No NUMA configuration found
Nov 29 05:35:33 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 29 05:35:33 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Nov 29 05:35:33 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 29 05:35:33 localhost kernel: Zone ranges:
Nov 29 05:35:33 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 29 05:35:33 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 29 05:35:33 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 05:35:33 localhost kernel:   Device   empty
Nov 29 05:35:33 localhost kernel: Movable zone start for each node
Nov 29 05:35:33 localhost kernel: Early memory node ranges
Nov 29 05:35:33 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 29 05:35:33 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 29 05:35:33 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 05:35:33 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 29 05:35:33 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 29 05:35:33 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 29 05:35:33 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 29 05:35:33 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 29 05:35:33 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 29 05:35:33 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 29 05:35:33 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 29 05:35:33 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 29 05:35:33 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 29 05:35:33 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 29 05:35:33 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 29 05:35:33 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 29 05:35:33 localhost kernel: TSC deadline timer available
Nov 29 05:35:33 localhost kernel: CPU topo: Max. logical packages:   8
Nov 29 05:35:33 localhost kernel: CPU topo: Max. logical dies:       8
Nov 29 05:35:33 localhost kernel: CPU topo: Max. dies per package:   1
Nov 29 05:35:33 localhost kernel: CPU topo: Max. threads per core:   1
Nov 29 05:35:33 localhost kernel: CPU topo: Num. cores per package:     1
Nov 29 05:35:33 localhost kernel: CPU topo: Num. threads per package:   1
Nov 29 05:35:33 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 29 05:35:33 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 29 05:35:33 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 29 05:35:33 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 29 05:35:33 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 29 05:35:33 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 29 05:35:33 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 29 05:35:33 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 29 05:35:33 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 29 05:35:33 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 29 05:35:33 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 29 05:35:33 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 29 05:35:33 localhost kernel: Booting paravirtualized kernel on KVM
Nov 29 05:35:33 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 29 05:35:33 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 29 05:35:33 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 29 05:35:33 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Nov 29 05:35:33 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 29 05:35:33 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 29 05:35:33 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 05:35:33 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 29 05:35:33 localhost kernel: random: crng init done
Nov 29 05:35:33 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 29 05:35:33 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 29 05:35:33 localhost kernel: Fallback order for Node 0: 0 
Nov 29 05:35:33 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 29 05:35:33 localhost kernel: Policy zone: Normal
Nov 29 05:35:33 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 29 05:35:33 localhost kernel: software IO TLB: area num 8.
Nov 29 05:35:33 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 29 05:35:33 localhost kernel: ftrace: allocating 49313 entries in 193 pages
Nov 29 05:35:33 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 29 05:35:33 localhost kernel: Dynamic Preempt: voluntary
Nov 29 05:35:33 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 29 05:35:33 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 29 05:35:33 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 29 05:35:33 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 29 05:35:33 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 29 05:35:33 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 29 05:35:33 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 29 05:35:33 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 29 05:35:33 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 05:35:33 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 05:35:33 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 05:35:33 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 29 05:35:33 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 29 05:35:33 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 29 05:35:33 localhost kernel: Console: colour VGA+ 80x25
Nov 29 05:35:33 localhost kernel: printk: console [ttyS0] enabled
Nov 29 05:35:33 localhost kernel: ACPI: Core revision 20230331
Nov 29 05:35:33 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 29 05:35:33 localhost kernel: x2apic enabled
Nov 29 05:35:33 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 29 05:35:33 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 29 05:35:33 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Nov 29 05:35:33 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 29 05:35:33 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 29 05:35:33 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 29 05:35:33 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 29 05:35:33 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 29 05:35:33 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 29 05:35:33 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 29 05:35:33 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 29 05:35:33 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 29 05:35:33 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 29 05:35:33 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 29 05:35:33 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 29 05:35:33 localhost kernel: x86/bugs: return thunk changed
Nov 29 05:35:33 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 29 05:35:33 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 29 05:35:33 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 29 05:35:33 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 29 05:35:33 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 29 05:35:33 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 29 05:35:33 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 29 05:35:33 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 29 05:35:33 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 29 05:35:33 localhost kernel: landlock: Up and running.
Nov 29 05:35:33 localhost kernel: Yama: becoming mindful.
Nov 29 05:35:33 localhost kernel: SELinux:  Initializing.
Nov 29 05:35:33 localhost kernel: LSM support for eBPF active
Nov 29 05:35:33 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 05:35:33 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 05:35:33 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 29 05:35:33 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 29 05:35:33 localhost kernel: ... version:                0
Nov 29 05:35:33 localhost kernel: ... bit width:              48
Nov 29 05:35:33 localhost kernel: ... generic registers:      6
Nov 29 05:35:33 localhost kernel: ... value mask:             0000ffffffffffff
Nov 29 05:35:33 localhost kernel: ... max period:             00007fffffffffff
Nov 29 05:35:33 localhost kernel: ... fixed-purpose events:   0
Nov 29 05:35:33 localhost kernel: ... event mask:             000000000000003f
Nov 29 05:35:33 localhost kernel: signal: max sigframe size: 1776
Nov 29 05:35:33 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 29 05:35:33 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 29 05:35:33 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 29 05:35:33 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 29 05:35:33 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 29 05:35:33 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 29 05:35:33 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Nov 29 05:35:33 localhost kernel: node 0 deferred pages initialised in 11ms
Nov 29 05:35:33 localhost kernel: Memory: 7766056K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616272K reserved, 0K cma-reserved)
Nov 29 05:35:33 localhost kernel: devtmpfs: initialized
Nov 29 05:35:33 localhost kernel: x86/mm: Memory block size: 128MB
Nov 29 05:35:33 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 29 05:35:33 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 29 05:35:33 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 29 05:35:33 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 29 05:35:33 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 29 05:35:33 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 29 05:35:33 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 29 05:35:33 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 29 05:35:33 localhost kernel: audit: type=2000 audit(1764394531.482:1): state=initialized audit_enabled=0 res=1
Nov 29 05:35:33 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 29 05:35:33 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 29 05:35:33 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 29 05:35:33 localhost kernel: cpuidle: using governor menu
Nov 29 05:35:33 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 29 05:35:33 localhost kernel: PCI: Using configuration type 1 for base access
Nov 29 05:35:33 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 29 05:35:33 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 29 05:35:33 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 29 05:35:33 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 29 05:35:33 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 29 05:35:33 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 29 05:35:33 localhost kernel: Demotion targets for Node 0: null
Nov 29 05:35:33 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 29 05:35:33 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 29 05:35:33 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 29 05:35:33 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 29 05:35:33 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 29 05:35:33 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 29 05:35:33 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 29 05:35:33 localhost kernel: ACPI: Interpreter enabled
Nov 29 05:35:33 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 29 05:35:33 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 29 05:35:33 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 29 05:35:33 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 29 05:35:33 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 29 05:35:33 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 29 05:35:33 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [3] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [4] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [5] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [6] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [7] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [8] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [9] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [10] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [11] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [12] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [13] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [14] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [15] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [16] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [17] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [18] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [19] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [20] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [21] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [22] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [23] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [24] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [25] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [26] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [27] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [28] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [29] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [30] registered
Nov 29 05:35:33 localhost kernel: acpiphp: Slot [31] registered
Nov 29 05:35:33 localhost kernel: PCI host bridge to bus 0000:00
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 29 05:35:33 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 29 05:35:33 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 29 05:35:33 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 29 05:35:33 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 29 05:35:33 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 29 05:35:33 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 29 05:35:33 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 29 05:35:33 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 05:35:33 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 29 05:35:33 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 29 05:35:33 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 29 05:35:33 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 29 05:35:33 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 29 05:35:33 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 29 05:35:33 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 29 05:35:33 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 29 05:35:33 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 05:35:33 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 29 05:35:33 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 29 05:35:33 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 05:35:33 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 29 05:35:33 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 29 05:35:33 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 29 05:35:33 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 29 05:35:33 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 29 05:35:33 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 29 05:35:33 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 29 05:35:33 localhost kernel: iommu: Default domain type: Translated
Nov 29 05:35:33 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 29 05:35:33 localhost kernel: SCSI subsystem initialized
Nov 29 05:35:33 localhost kernel: ACPI: bus type USB registered
Nov 29 05:35:33 localhost kernel: usbcore: registered new interface driver usbfs
Nov 29 05:35:33 localhost kernel: usbcore: registered new interface driver hub
Nov 29 05:35:33 localhost kernel: usbcore: registered new device driver usb
Nov 29 05:35:33 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 29 05:35:33 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 29 05:35:33 localhost kernel: PTP clock support registered
Nov 29 05:35:33 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 29 05:35:33 localhost kernel: NetLabel: Initializing
Nov 29 05:35:33 localhost kernel: NetLabel:  domain hash size = 128
Nov 29 05:35:33 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 29 05:35:33 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 29 05:35:33 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 29 05:35:33 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 29 05:35:33 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 29 05:35:33 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 29 05:35:33 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 29 05:35:33 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 29 05:35:33 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 29 05:35:33 localhost kernel: vgaarb: loaded
Nov 29 05:35:33 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 29 05:35:33 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 29 05:35:33 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 29 05:35:33 localhost kernel: pnp: PnP ACPI init
Nov 29 05:35:33 localhost kernel: pnp 00:03: [dma 2]
Nov 29 05:35:33 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 29 05:35:33 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 29 05:35:33 localhost kernel: NET: Registered PF_INET protocol family
Nov 29 05:35:33 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 29 05:35:33 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 29 05:35:33 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 29 05:35:33 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 29 05:35:33 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 29 05:35:33 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 29 05:35:33 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 29 05:35:33 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 05:35:33 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 05:35:33 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 29 05:35:33 localhost kernel: NET: Registered PF_XDP protocol family
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 29 05:35:33 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 29 05:35:33 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 29 05:35:33 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 29 05:35:33 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 71681 usecs
Nov 29 05:35:33 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 29 05:35:33 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 29 05:35:33 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 29 05:35:33 localhost kernel: ACPI: bus type thunderbolt registered
Nov 29 05:35:33 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 29 05:35:33 localhost kernel: Initialise system trusted keyrings
Nov 29 05:35:33 localhost kernel: Key type blacklist registered
Nov 29 05:35:33 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 29 05:35:33 localhost kernel: zbud: loaded
Nov 29 05:35:33 localhost kernel: integrity: Platform Keyring initialized
Nov 29 05:35:33 localhost kernel: integrity: Machine keyring initialized
Nov 29 05:35:33 localhost kernel: Freeing initrd memory: 85868K
Nov 29 05:35:33 localhost kernel: NET: Registered PF_ALG protocol family
Nov 29 05:35:33 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 29 05:35:33 localhost kernel: Key type asymmetric registered
Nov 29 05:35:33 localhost kernel: Asymmetric key parser 'x509' registered
Nov 29 05:35:33 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 29 05:35:33 localhost kernel: io scheduler mq-deadline registered
Nov 29 05:35:33 localhost kernel: io scheduler kyber registered
Nov 29 05:35:33 localhost kernel: io scheduler bfq registered
Nov 29 05:35:33 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 29 05:35:33 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 29 05:35:33 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 29 05:35:33 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 29 05:35:33 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 29 05:35:33 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 29 05:35:33 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 29 05:35:33 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 29 05:35:33 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 29 05:35:33 localhost kernel: Non-volatile memory driver v1.3
Nov 29 05:35:33 localhost kernel: rdac: device handler registered
Nov 29 05:35:33 localhost kernel: hp_sw: device handler registered
Nov 29 05:35:33 localhost kernel: emc: device handler registered
Nov 29 05:35:33 localhost kernel: alua: device handler registered
Nov 29 05:35:33 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 29 05:35:33 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 29 05:35:33 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 29 05:35:33 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 29 05:35:33 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 29 05:35:33 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 29 05:35:33 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 29 05:35:33 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 29 05:35:33 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 29 05:35:33 localhost kernel: hub 1-0:1.0: USB hub found
Nov 29 05:35:33 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 29 05:35:33 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 29 05:35:33 localhost kernel: usbserial: USB Serial support registered for generic
Nov 29 05:35:33 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 29 05:35:33 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 29 05:35:33 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 29 05:35:33 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 29 05:35:33 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 29 05:35:33 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 29 05:35:33 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 29 05:35:33 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-29T05:35:32 UTC (1764394532)
Nov 29 05:35:33 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 29 05:35:33 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 29 05:35:33 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 29 05:35:33 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 29 05:35:33 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 29 05:35:33 localhost kernel: usbcore: registered new interface driver usbhid
Nov 29 05:35:33 localhost kernel: usbhid: USB HID core driver
Nov 29 05:35:33 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 29 05:35:33 localhost kernel: Initializing XFRM netlink socket
Nov 29 05:35:33 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 29 05:35:33 localhost kernel: Segment Routing with IPv6
Nov 29 05:35:33 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 29 05:35:33 localhost kernel: mpls_gso: MPLS GSO support
Nov 29 05:35:33 localhost kernel: IPI shorthand broadcast: enabled
Nov 29 05:35:33 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 29 05:35:33 localhost kernel: AES CTR mode by8 optimization enabled
Nov 29 05:35:33 localhost kernel: sched_clock: Marking stable (1236008419, 144514150)->(1454172619, -73650050)
Nov 29 05:35:33 localhost kernel: registered taskstats version 1
Nov 29 05:35:33 localhost kernel: Loading compiled-in X.509 certificates
Nov 29 05:35:33 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 05:35:33 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 29 05:35:33 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 29 05:35:33 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 29 05:35:33 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 29 05:35:33 localhost kernel: Demotion targets for Node 0: null
Nov 29 05:35:33 localhost kernel: page_owner is disabled
Nov 29 05:35:33 localhost kernel: Key type .fscrypt registered
Nov 29 05:35:33 localhost kernel: Key type fscrypt-provisioning registered
Nov 29 05:35:33 localhost kernel: Key type big_key registered
Nov 29 05:35:33 localhost kernel: Key type encrypted registered
Nov 29 05:35:33 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 29 05:35:33 localhost kernel: Loading compiled-in module X.509 certificates
Nov 29 05:35:33 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 05:35:33 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 29 05:35:33 localhost kernel: ima: No architecture policies found
Nov 29 05:35:33 localhost kernel: evm: Initialising EVM extended attributes:
Nov 29 05:35:33 localhost kernel: evm: security.selinux
Nov 29 05:35:33 localhost kernel: evm: security.SMACK64 (disabled)
Nov 29 05:35:33 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 29 05:35:33 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 29 05:35:33 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 29 05:35:33 localhost kernel: evm: security.apparmor (disabled)
Nov 29 05:35:33 localhost kernel: evm: security.ima
Nov 29 05:35:33 localhost kernel: evm: security.capability
Nov 29 05:35:33 localhost kernel: evm: HMAC attrs: 0x1
Nov 29 05:35:33 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 29 05:35:33 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 29 05:35:33 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 29 05:35:33 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 29 05:35:33 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 29 05:35:33 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 29 05:35:33 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 29 05:35:33 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 29 05:35:33 localhost kernel: Running certificate verification RSA selftest
Nov 29 05:35:33 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 29 05:35:33 localhost kernel: Running certificate verification ECDSA selftest
Nov 29 05:35:33 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 29 05:35:33 localhost kernel: clk: Disabling unused clocks
Nov 29 05:35:33 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 29 05:35:33 localhost kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 29 05:35:33 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 29 05:35:33 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 29 05:35:33 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 29 05:35:33 localhost kernel: Run /init as init process
Nov 29 05:35:33 localhost kernel:   with arguments:
Nov 29 05:35:33 localhost kernel:     /init
Nov 29 05:35:33 localhost kernel:   with environment:
Nov 29 05:35:33 localhost kernel:     HOME=/
Nov 29 05:35:33 localhost kernel:     TERM=linux
Nov 29 05:35:33 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64
Nov 29 05:35:33 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 05:35:33 localhost systemd[1]: Detected virtualization kvm.
Nov 29 05:35:33 localhost systemd[1]: Detected architecture x86-64.
Nov 29 05:35:33 localhost systemd[1]: Running in initrd.
Nov 29 05:35:33 localhost systemd[1]: No hostname configured, using default hostname.
Nov 29 05:35:33 localhost systemd[1]: Hostname set to <localhost>.
Nov 29 05:35:33 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 29 05:35:33 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 29 05:35:33 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 05:35:33 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 29 05:35:33 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 29 05:35:33 localhost systemd[1]: Reached target Local File Systems.
Nov 29 05:35:33 localhost systemd[1]: Reached target Path Units.
Nov 29 05:35:33 localhost systemd[1]: Reached target Slice Units.
Nov 29 05:35:33 localhost systemd[1]: Reached target Swaps.
Nov 29 05:35:33 localhost systemd[1]: Reached target Timer Units.
Nov 29 05:35:33 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 29 05:35:33 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 29 05:35:33 localhost systemd[1]: Listening on Journal Socket.
Nov 29 05:35:33 localhost systemd[1]: Listening on udev Control Socket.
Nov 29 05:35:33 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 29 05:35:33 localhost systemd[1]: Reached target Socket Units.
Nov 29 05:35:33 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 29 05:35:33 localhost systemd[1]: Starting Journal Service...
Nov 29 05:35:33 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 05:35:33 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 29 05:35:33 localhost systemd[1]: Starting Create System Users...
Nov 29 05:35:33 localhost systemd[1]: Starting Setup Virtual Console...
Nov 29 05:35:33 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 29 05:35:33 localhost systemd-journald[303]: Journal started
Nov 29 05:35:33 localhost systemd-journald[303]: Runtime Journal (/run/log/journal/4a1784f42c5f4879a5f6acc886e56ebb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 05:35:33 localhost systemd[1]: Started Journal Service.
Nov 29 05:35:33 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 29 05:35:33 localhost systemd-sysusers[307]: Creating group 'users' with GID 100.
Nov 29 05:35:33 localhost systemd-sysusers[307]: Creating group 'dbus' with GID 81.
Nov 29 05:35:33 localhost systemd-sysusers[307]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 29 05:35:33 localhost systemd[1]: Finished Create System Users.
Nov 29 05:35:33 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 05:35:33 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 05:35:33 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 05:35:33 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 05:35:33 localhost systemd[1]: Finished Setup Virtual Console.
Nov 29 05:35:33 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 29 05:35:33 localhost systemd[1]: Starting dracut cmdline hook...
Nov 29 05:35:33 localhost dracut-cmdline[322]: dracut-9 dracut-057-102.git20250818.el9
Nov 29 05:35:33 localhost dracut-cmdline[322]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 05:35:33 localhost systemd[1]: Finished dracut cmdline hook.
Nov 29 05:35:33 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 29 05:35:33 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 29 05:35:33 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 29 05:35:33 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 29 05:35:33 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 29 05:35:33 localhost kernel: RPC: Registered udp transport module.
Nov 29 05:35:33 localhost kernel: RPC: Registered tcp transport module.
Nov 29 05:35:33 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 29 05:35:33 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 29 05:35:34 localhost rpc.statd[440]: Version 2.5.4 starting
Nov 29 05:35:34 localhost rpc.statd[440]: Initializing NSM state
Nov 29 05:35:34 localhost rpc.idmapd[445]: Setting log level to 0
Nov 29 05:35:34 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 29 05:35:34 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 05:35:34 localhost systemd-udevd[458]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 05:35:34 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 05:35:34 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 29 05:35:34 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 29 05:35:34 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 29 05:35:34 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 29 05:35:34 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 29 05:35:34 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 29 05:35:34 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 05:35:34 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 29 05:35:34 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 29 05:35:34 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 05:35:34 localhost systemd[1]: Reached target Network.
Nov 29 05:35:34 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 05:35:34 localhost systemd[1]: Starting dracut initqueue hook...
Nov 29 05:35:34 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 29 05:35:34 localhost systemd[1]: Reached target System Initialization.
Nov 29 05:35:34 localhost systemd[1]: Reached target Basic System.
Nov 29 05:35:34 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 29 05:35:34 localhost kernel: libata version 3.00 loaded.
Nov 29 05:35:34 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 29 05:35:34 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 29 05:35:34 localhost kernel: scsi host0: ata_piix
Nov 29 05:35:34 localhost kernel:  vda: vda1
Nov 29 05:35:34 localhost kernel: scsi host1: ata_piix
Nov 29 05:35:34 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 29 05:35:34 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 29 05:35:34 localhost kernel: ata1: found unknown device (class 0)
Nov 29 05:35:34 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 29 05:35:34 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 29 05:35:34 localhost systemd-udevd[476]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 05:35:34 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 29 05:35:34 localhost systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 05:35:34 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 29 05:35:34 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 29 05:35:34 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 29 05:35:34 localhost systemd[1]: Reached target Initrd Root Device.
Nov 29 05:35:34 localhost systemd[1]: Finished dracut initqueue hook.
Nov 29 05:35:34 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 05:35:34 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 29 05:35:34 localhost systemd[1]: Reached target Remote File Systems.
Nov 29 05:35:34 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 29 05:35:34 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 29 05:35:34 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Nov 29 05:35:34 localhost systemd-fsck[553]: /usr/sbin/fsck.xfs: XFS file system.
Nov 29 05:35:34 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 05:35:34 localhost systemd[1]: Mounting /sysroot...
Nov 29 05:35:35 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 29 05:35:35 localhost kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Nov 29 05:35:35 localhost kernel: XFS (vda1): Ending clean mount
Nov 29 05:35:35 localhost systemd[1]: Mounted /sysroot.
Nov 29 05:35:35 localhost systemd[1]: Reached target Initrd Root File System.
Nov 29 05:35:35 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 29 05:35:35 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 29 05:35:35 localhost systemd[1]: Reached target Initrd File Systems.
Nov 29 05:35:35 localhost systemd[1]: Reached target Initrd Default Target.
Nov 29 05:35:35 localhost systemd[1]: Starting dracut mount hook...
Nov 29 05:35:35 localhost systemd[1]: Finished dracut mount hook.
Nov 29 05:35:35 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 29 05:35:35 localhost rpc.idmapd[445]: exiting on signal 15
Nov 29 05:35:35 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 29 05:35:35 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 29 05:35:35 localhost systemd[1]: Stopped target Network.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Timer Units.
Nov 29 05:35:35 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 29 05:35:35 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Basic System.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Path Units.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Remote File Systems.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Slice Units.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Socket Units.
Nov 29 05:35:35 localhost systemd[1]: Stopped target System Initialization.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Local File Systems.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Swaps.
Nov 29 05:35:35 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped dracut mount hook.
Nov 29 05:35:35 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 29 05:35:35 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 29 05:35:35 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 29 05:35:35 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 29 05:35:35 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 29 05:35:35 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 29 05:35:35 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 29 05:35:35 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 29 05:35:35 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 29 05:35:35 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 29 05:35:35 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 29 05:35:35 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 29 05:35:35 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Closed udev Control Socket.
Nov 29 05:35:35 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Closed udev Kernel Socket.
Nov 29 05:35:35 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 29 05:35:35 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 29 05:35:35 localhost systemd[1]: Starting Cleanup udev Database...
Nov 29 05:35:35 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 29 05:35:35 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 29 05:35:35 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Stopped Create System Users.
Nov 29 05:35:35 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 29 05:35:35 localhost systemd[1]: Finished Cleanup udev Database.
Nov 29 05:35:35 localhost systemd[1]: Reached target Switch Root.
Nov 29 05:35:35 localhost systemd[1]: Starting Switch Root...
Nov 29 05:35:35 localhost systemd[1]: Switching root.
Nov 29 05:35:35 localhost systemd-journald[303]: Journal stopped
Nov 29 05:35:36 localhost systemd-journald[303]: Received SIGTERM from PID 1 (systemd).
Nov 29 05:35:36 localhost kernel: audit: type=1404 audit(1764394535.826:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 29 05:35:36 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:35:36 localhost kernel: SELinux:  policy capability open_perms=1
Nov 29 05:35:36 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:35:36 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:35:36 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:35:36 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:35:36 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:35:36 localhost kernel: audit: type=1403 audit(1764394535.949:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 29 05:35:36 localhost systemd[1]: Successfully loaded SELinux policy in 125.420ms.
Nov 29 05:35:36 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 32.913ms.
Nov 29 05:35:36 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 05:35:36 localhost systemd[1]: Detected virtualization kvm.
Nov 29 05:35:36 localhost systemd[1]: Detected architecture x86-64.
Nov 29 05:35:36 localhost systemd-rc-local-generator[635]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 05:35:36 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 29 05:35:36 localhost systemd[1]: Stopped Switch Root.
Nov 29 05:35:36 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 29 05:35:36 localhost systemd[1]: Created slice Slice /system/getty.
Nov 29 05:35:36 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 29 05:35:36 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 29 05:35:36 localhost systemd[1]: Created slice User and Session Slice.
Nov 29 05:35:36 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 05:35:36 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 29 05:35:36 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 29 05:35:36 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 29 05:35:36 localhost systemd[1]: Stopped target Switch Root.
Nov 29 05:35:36 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 29 05:35:36 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 29 05:35:36 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 29 05:35:36 localhost systemd[1]: Reached target Path Units.
Nov 29 05:35:36 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 29 05:35:36 localhost systemd[1]: Reached target Slice Units.
Nov 29 05:35:36 localhost systemd[1]: Reached target Swaps.
Nov 29 05:35:36 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 29 05:35:36 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 29 05:35:36 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 29 05:35:36 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 29 05:35:36 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 29 05:35:36 localhost systemd[1]: Listening on udev Control Socket.
Nov 29 05:35:36 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 29 05:35:36 localhost systemd[1]: Mounting Huge Pages File System...
Nov 29 05:35:36 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 29 05:35:36 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 29 05:35:36 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 29 05:35:36 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 05:35:36 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 29 05:35:36 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 29 05:35:36 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 29 05:35:36 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 29 05:35:36 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 29 05:35:36 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 29 05:35:36 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 29 05:35:36 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 29 05:35:36 localhost systemd[1]: Stopped Journal Service.
Nov 29 05:35:36 localhost systemd[1]: Starting Journal Service...
Nov 29 05:35:36 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 05:35:36 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 29 05:35:36 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 05:35:36 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 29 05:35:36 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 29 05:35:36 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 29 05:35:36 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 29 05:35:36 localhost kernel: ACPI: bus type drm_connector registered
Nov 29 05:35:36 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 29 05:35:36 localhost kernel: fuse: init (API version 7.37)
Nov 29 05:35:36 localhost systemd-journald[676]: Journal started
Nov 29 05:35:36 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 05:35:36 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 29 05:35:36 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 29 05:35:36 localhost systemd[1]: Mounted Huge Pages File System.
Nov 29 05:35:36 localhost systemd[1]: Started Journal Service.
Nov 29 05:35:36 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 29 05:35:36 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 29 05:35:36 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 29 05:35:36 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 29 05:35:36 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 05:35:36 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 29 05:35:36 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 29 05:35:36 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 29 05:35:36 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 29 05:35:36 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 29 05:35:36 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 29 05:35:36 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 29 05:35:36 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 29 05:35:36 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 29 05:35:36 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 29 05:35:36 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 29 05:35:36 localhost systemd[1]: Mounting FUSE Control File System...
Nov 29 05:35:36 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 05:35:36 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 29 05:35:36 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 29 05:35:36 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 29 05:35:36 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 29 05:35:36 localhost systemd[1]: Starting Create System Users...
Nov 29 05:35:36 localhost systemd[1]: Mounted FUSE Control File System.
Nov 29 05:35:36 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 05:35:36 localhost systemd-journald[676]: Received client request to flush runtime journal.
Nov 29 05:35:36 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 29 05:35:36 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 29 05:35:36 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 05:35:36 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 29 05:35:36 localhost systemd[1]: Finished Create System Users.
Nov 29 05:35:36 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 05:35:36 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 05:35:36 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 29 05:35:36 localhost systemd[1]: Reached target Local File Systems.
Nov 29 05:35:36 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 29 05:35:36 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 29 05:35:36 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 29 05:35:36 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 29 05:35:36 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 29 05:35:36 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 29 05:35:36 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 05:35:36 localhost bootctl[694]: Couldn't find EFI system partition, skipping.
Nov 29 05:35:36 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 29 05:35:36 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 05:35:36 localhost systemd[1]: Starting Security Auditing Service...
Nov 29 05:35:36 localhost systemd[1]: Starting RPC Bind...
Nov 29 05:35:36 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 29 05:35:36 localhost auditd[699]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 29 05:35:36 localhost auditd[699]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 29 05:35:36 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 29 05:35:36 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 29 05:35:36 localhost augenrules[705]: /sbin/augenrules: No change
Nov 29 05:35:36 localhost systemd[1]: Started RPC Bind.
Nov 29 05:35:36 localhost augenrules[720]: No rules
Nov 29 05:35:36 localhost augenrules[720]: enabled 1
Nov 29 05:35:36 localhost augenrules[720]: failure 1
Nov 29 05:35:36 localhost augenrules[720]: pid 699
Nov 29 05:35:36 localhost augenrules[720]: rate_limit 0
Nov 29 05:35:36 localhost augenrules[720]: backlog_limit 8192
Nov 29 05:35:36 localhost augenrules[720]: lost 0
Nov 29 05:35:36 localhost augenrules[720]: backlog 4
Nov 29 05:35:36 localhost augenrules[720]: backlog_wait_time 60000
Nov 29 05:35:36 localhost augenrules[720]: backlog_wait_time_actual 0
Nov 29 05:35:36 localhost augenrules[720]: enabled 1
Nov 29 05:35:36 localhost augenrules[720]: failure 1
Nov 29 05:35:36 localhost augenrules[720]: pid 699
Nov 29 05:35:36 localhost augenrules[720]: rate_limit 0
Nov 29 05:35:36 localhost augenrules[720]: backlog_limit 8192
Nov 29 05:35:36 localhost augenrules[720]: lost 0
Nov 29 05:35:36 localhost augenrules[720]: backlog 1
Nov 29 05:35:36 localhost augenrules[720]: backlog_wait_time 60000
Nov 29 05:35:36 localhost augenrules[720]: backlog_wait_time_actual 0
Nov 29 05:35:36 localhost augenrules[720]: enabled 1
Nov 29 05:35:36 localhost augenrules[720]: failure 1
Nov 29 05:35:36 localhost augenrules[720]: pid 699
Nov 29 05:35:36 localhost augenrules[720]: rate_limit 0
Nov 29 05:35:36 localhost augenrules[720]: backlog_limit 8192
Nov 29 05:35:36 localhost augenrules[720]: lost 0
Nov 29 05:35:36 localhost augenrules[720]: backlog 0
Nov 29 05:35:36 localhost augenrules[720]: backlog_wait_time 60000
Nov 29 05:35:36 localhost augenrules[720]: backlog_wait_time_actual 0
Nov 29 05:35:36 localhost systemd[1]: Started Security Auditing Service.
Nov 29 05:35:36 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 29 05:35:37 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 29 05:35:37 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 29 05:35:37 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 05:35:37 localhost systemd[1]: Starting Update is Completed...
Nov 29 05:35:37 localhost systemd[1]: Finished Update is Completed.
Nov 29 05:35:37 localhost systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 05:35:37 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 05:35:37 localhost systemd[1]: Reached target System Initialization.
Nov 29 05:35:37 localhost systemd[1]: Started dnf makecache --timer.
Nov 29 05:35:37 localhost systemd[1]: Started Daily rotation of log files.
Nov 29 05:35:37 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 29 05:35:37 localhost systemd[1]: Reached target Timer Units.
Nov 29 05:35:37 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 29 05:35:37 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 29 05:35:37 localhost systemd[1]: Reached target Socket Units.
Nov 29 05:35:37 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 29 05:35:37 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 05:35:37 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 29 05:35:37 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 05:35:37 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 29 05:35:37 localhost systemd-udevd[741]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 05:35:37 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 29 05:35:37 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 29 05:35:37 localhost systemd[1]: Reached target Basic System.
Nov 29 05:35:37 localhost dbus-broker-lau[765]: Ready
Nov 29 05:35:37 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 29 05:35:37 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 29 05:35:37 localhost systemd[1]: Starting NTP client/server...
Nov 29 05:35:37 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 29 05:35:37 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 29 05:35:37 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 29 05:35:37 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 29 05:35:37 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 29 05:35:37 localhost systemd[1]: Started irqbalance daemon.
Nov 29 05:35:37 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 29 05:35:37 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 05:35:37 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 05:35:37 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 05:35:37 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 29 05:35:37 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 29 05:35:37 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 29 05:35:37 localhost systemd[1]: Starting User Login Management...
Nov 29 05:35:37 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 29 05:35:37 localhost chronyd[787]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 05:35:37 localhost chronyd[787]: Loaded 0 symmetric keys
Nov 29 05:35:37 localhost chronyd[787]: Using right/UTC timezone to obtain leap second data
Nov 29 05:35:37 localhost chronyd[787]: Loaded seccomp filter (level 2)
Nov 29 05:35:37 localhost systemd[1]: Started NTP client/server.
Nov 29 05:35:37 localhost systemd-logind[784]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 05:35:37 localhost systemd-logind[784]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 05:35:37 localhost systemd-logind[784]: New seat seat0.
Nov 29 05:35:37 localhost systemd[1]: Started User Login Management.
Nov 29 05:35:37 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 29 05:35:37 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 29 05:35:37 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 29 05:35:37 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 29 05:35:37 localhost kernel: Console: switching to colour dummy device 80x25
Nov 29 05:35:37 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 29 05:35:37 localhost kernel: [drm] features: -context_init
Nov 29 05:35:37 localhost kernel: [drm] number of scanouts: 1
Nov 29 05:35:37 localhost kernel: [drm] number of cap sets: 0
Nov 29 05:35:37 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 29 05:35:37 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 29 05:35:37 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 29 05:35:37 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 29 05:35:37 localhost kernel: kvm_amd: TSC scaling supported
Nov 29 05:35:37 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 29 05:35:37 localhost kernel: kvm_amd: Nested Paging enabled
Nov 29 05:35:37 localhost kernel: kvm_amd: LBR virtualization supported
Nov 29 05:35:37 localhost iptables.init[779]: iptables: Applying firewall rules: [  OK  ]
Nov 29 05:35:37 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 29 05:35:37 localhost cloud-init[839]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 29 Nov 2025 05:35:37 +0000. Up 6.56 seconds.
Nov 29 05:35:38 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 29 05:35:38 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 29 05:35:38 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpqai37alp.mount: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Starting Hostname Service...
Nov 29 05:35:38 localhost systemd[1]: Started Hostname Service.
Nov 29 05:35:38 np0005539510.novalocal systemd-hostnamed[853]: Hostname set to <np0005539510.novalocal> (static)
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Reached target Preparation for Network.
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Starting Network Manager...
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.4849] NetworkManager (version 1.54.1-1.el9) is starting... (boot:631d2949-c1d4-4f67-afc4-db082a3ff43a)
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.4856] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.4943] manager[0x563c469b8080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.4984] hostname: hostname: using hostnamed
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.4984] hostname: static hostname changed from (none) to "np0005539510.novalocal"
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.4989] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5160] manager[0x563c469b8080]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5160] manager[0x563c469b8080]: rfkill: WWAN hardware radio set enabled
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5210] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5210] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5212] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5212] manager: Networking is enabled by state file
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5214] settings: Loaded settings plugin: keyfile (internal)
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5226] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5259] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5280] dhcp: init: Using DHCP client 'internal'
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5283] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5301] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5311] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5321] device (lo): Activation: starting connection 'lo' (5b6e73d6-4c36-495c-9d49-56d866cbd8e2)
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5333] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5337] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5376] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5384] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5386] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5390] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5392] device (eth0): carrier: link connected
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5394] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5405] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5417] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5441] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5443] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5445] manager: NetworkManager state is now CONNECTING
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5447] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5458] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5461] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Started Network Manager.
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Reached target Network.
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5672] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5676] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.5692] device (lo): Activation: successful, device activated.
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Reached target NFS client services.
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Reached target Remote File Systems.
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.7224] dhcp4 (eth0): state changed new lease, address=38.102.83.94
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.7237] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.7259] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.7309] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.7313] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.7320] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.7327] device (eth0): Activation: successful, device activated.
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.7338] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 05:35:38 np0005539510.novalocal NetworkManager[857]: <info>  [1764394538.7344] manager: startup complete
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 29 05:35:38 np0005539510.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 29 Nov 2025 05:35:39 +0000. Up 7.72 seconds.
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: |  eth0  | True |         38.102.83.94         | 255.255.255.0 | global | fa:16:3e:62:80:6f |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe62:806f/64 |       .       |  link  | fa:16:3e:62:80:6f |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 29 05:35:39 np0005539510.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 05:35:40 np0005539510.novalocal useradd[986]: new group: name=cloud-user, GID=1001
Nov 29 05:35:40 np0005539510.novalocal useradd[986]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 29 05:35:40 np0005539510.novalocal useradd[986]: add 'cloud-user' to group 'adm'
Nov 29 05:35:40 np0005539510.novalocal useradd[986]: add 'cloud-user' to group 'systemd-journal'
Nov 29 05:35:40 np0005539510.novalocal useradd[986]: add 'cloud-user' to shadow group 'adm'
Nov 29 05:35:40 np0005539510.novalocal useradd[986]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: Generating public/private rsa key pair.
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: The key fingerprint is:
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: SHA256:RCpG0KO3RkDTAwx6sae5aoVt8m1SH/n2QdTs/YFRGOM root@np0005539510.novalocal
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: The key's randomart image is:
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: +---[RSA 3072]----+
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |.+*=.   .    oo. |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |. o==  o    +.o  |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |. oo+o. .  . E   |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: | ..=o. .  . . +  |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |  =o .  S  . o o |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: | o =o. o  .     o|
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |  *.o . o  .    .|
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: | o o o . o  .    |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |o   o   . ..     |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: +----[SHA256]-----+
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: Generating public/private ecdsa key pair.
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: The key fingerprint is:
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: SHA256:nG6jfywUu6AsvzEUvN5IQcVxqXbc/fw8Kbp0W8eiUhM root@np0005539510.novalocal
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: The key's randomart image is:
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: +---[ECDSA 256]---+
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |    .oo...       |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |   o  ...        |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |    +  o . .     |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |     +o.+.. E    |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |    +. .So   +   |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |   + o..o   o o. |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |   .=..o+o o o.++|
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |  . oo oo.= .oo++|
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |   oo....o ++.. .|
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: +----[SHA256]-----+
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: Generating public/private ed25519 key pair.
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: The key fingerprint is:
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: SHA256:mASfPWBs+dgCD81awWsnWl9vNcMu1QkamaXtCAzGVEw root@np0005539510.novalocal
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: The key's randomart image is:
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: +--[ED25519 256]--+
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |    .=**+E  +.   |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |    o+X=o. +o.   |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |     B+=oo .oo...|
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |    ..Oo+.o.o *..|
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |     +o=S. o = o |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |    .   .   + .  |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |           . .   |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |                 |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: |                 |
Nov 29 05:35:40 np0005539510.novalocal cloud-init[920]: +----[SHA256]-----+
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Reached target Network is Online.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Starting System Logging Service...
Nov 29 05:35:40 np0005539510.novalocal sm-notify[1002]: Version 2.5.4 starting
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Starting Permit User Sessions...
Nov 29 05:35:40 np0005539510.novalocal sshd[1004]: Server listening on 0.0.0.0 port 22.
Nov 29 05:35:40 np0005539510.novalocal sshd[1004]: Server listening on :: port 22.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Finished Permit User Sessions.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Started Command Scheduler.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Started Getty on tty1.
Nov 29 05:35:40 np0005539510.novalocal crond[1007]: (CRON) STARTUP (1.5.7)
Nov 29 05:35:40 np0005539510.novalocal crond[1007]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 29 05:35:40 np0005539510.novalocal crond[1007]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 3% if used.)
Nov 29 05:35:40 np0005539510.novalocal crond[1007]: (CRON) INFO (running with inotify support)
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Reached target Login Prompts.
Nov 29 05:35:40 np0005539510.novalocal rsyslogd[1003]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1003" x-info="https://www.rsyslog.com"] start
Nov 29 05:35:40 np0005539510.novalocal rsyslogd[1003]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Started System Logging Service.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Reached target Multi-User System.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 29 05:35:40 np0005539510.novalocal rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 05:35:40 np0005539510.novalocal kdumpctl[1015]: kdump: No kdump initial ramdisk found.
Nov 29 05:35:40 np0005539510.novalocal kdumpctl[1015]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 29 05:35:40 np0005539510.novalocal cloud-init[1076]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 29 Nov 2025 05:35:40 +0000. Up 9.39 seconds.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Nov 29 05:35:40 np0005539510.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Nov 29 05:35:40 np0005539510.novalocal sshd-session[1145]: Unable to negotiate with 38.102.83.114 port 58804: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 29 05:35:40 np0005539510.novalocal sshd-session[1162]: Unable to negotiate with 38.102.83.114 port 58820: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 29 05:35:40 np0005539510.novalocal sshd-session[1172]: Unable to negotiate with 38.102.83.114 port 58822: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 29 05:35:41 np0005539510.novalocal sshd-session[1183]: Connection reset by 38.102.83.114 port 58834 [preauth]
Nov 29 05:35:41 np0005539510.novalocal sshd-session[1133]: Connection closed by 38.102.83.114 port 58796 [preauth]
Nov 29 05:35:41 np0005539510.novalocal sshd-session[1196]: Connection reset by 38.102.83.114 port 58838 [preauth]
Nov 29 05:35:41 np0005539510.novalocal sshd-session[1211]: Unable to negotiate with 38.102.83.114 port 58844: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 29 05:35:41 np0005539510.novalocal sshd-session[1155]: Connection closed by 38.102.83.114 port 58808 [preauth]
Nov 29 05:35:41 np0005539510.novalocal sshd-session[1218]: Unable to negotiate with 38.102.83.114 port 58860: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
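The burst of "no matching host key type found" messages above is a remote host (38.102.83.114) probing the new server with one host key algorithm per connection; each offer sshd rejects is recorded verbatim. As a minimal sketch (not part of the log itself), the offered algorithms can be tallied from the log text like this — the abridged `LOG` sample below just reuses three of the lines above:

```python
import re

# Abridged sshd lines copied from the log above.
LOG = """\
sshd-session[1082]: Unable to negotiate with 38.102.83.114 port 58804: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
sshd-session[1211]: Unable to negotiate with 38.102.83.114 port 58844: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
sshd-session[1218]: Unable to negotiate with 38.102.83.114 port 58860: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
"""

def offered_algorithms(log_text):
    """Collect every host key algorithm offered in failed negotiations."""
    offers = set()
    for m in re.finditer(r"Their offer: ([\w@.,-]+)", log_text):
        offers.update(m.group(1).split(","))
    return sorted(offers)

print(offered_algorithms(LOG))
```

Seeing `ssh-dss` in the offers is a strong hint this is a scanner, since DSA host keys have been disabled by default in OpenSSH for years.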
Nov 29 05:35:41 np0005539510.novalocal cloud-init[1280]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 29 Nov 2025 05:35:41 +0000. Up 9.81 seconds.
Nov 29 05:35:41 np0005539510.novalocal dracut[1284]: dracut-057-102.git20250818.el9
Nov 29 05:35:41 np0005539510.novalocal cloud-init[1301]: #############################################################
Nov 29 05:35:41 np0005539510.novalocal cloud-init[1302]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 29 05:35:41 np0005539510.novalocal cloud-init[1304]: 256 SHA256:nG6jfywUu6AsvzEUvN5IQcVxqXbc/fw8Kbp0W8eiUhM root@np0005539510.novalocal (ECDSA)
Nov 29 05:35:41 np0005539510.novalocal cloud-init[1306]: 256 SHA256:mASfPWBs+dgCD81awWsnWl9vNcMu1QkamaXtCAzGVEw root@np0005539510.novalocal (ED25519)
Nov 29 05:35:41 np0005539510.novalocal cloud-init[1308]: 3072 SHA256:RCpG0KO3RkDTAwx6sae5aoVt8m1SH/n2QdTs/YFRGOM root@np0005539510.novalocal (RSA)
Nov 29 05:35:41 np0005539510.novalocal cloud-init[1309]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 29 05:35:41 np0005539510.novalocal cloud-init[1310]: #############################################################
Nov 29 05:35:41 np0005539510.novalocal cloud-init[1280]: Cloud-init v. 24.4-7.el9 finished at Sat, 29 Nov 2025 05:35:41 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.00 seconds
Nov 29 05:35:41 np0005539510.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Nov 29 05:35:41 np0005539510.novalocal systemd[1]: Reached target Cloud-init target.
Nov 29 05:35:41 np0005539510.novalocal dracut[1286]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 29 05:35:41 np0005539510.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 29 05:35:41 np0005539510.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 29 05:35:41 np0005539510.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 29 05:35:41 np0005539510.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 05:35:41 np0005539510.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 05:35:41 np0005539510.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 05:35:41 np0005539510.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 05:35:41 np0005539510.novalocal dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: memstrack is not available
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: memstrack is not available
Nov 29 05:35:42 np0005539510.novalocal dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 05:35:43 np0005539510.novalocal dracut[1286]: *** Including module: systemd ***
Nov 29 05:35:43 np0005539510.novalocal dracut[1286]: *** Including module: fips ***
Nov 29 05:35:43 np0005539510.novalocal dracut[1286]: *** Including module: systemd-initrd ***
Nov 29 05:35:43 np0005539510.novalocal chronyd[787]: Selected source 162.159.200.123 (2.centos.pool.ntp.org)
Nov 29 05:35:43 np0005539510.novalocal chronyd[787]: System clock TAI offset set to 37 seconds
Nov 29 05:35:43 np0005539510.novalocal dracut[1286]: *** Including module: i18n ***
Nov 29 05:35:43 np0005539510.novalocal dracut[1286]: *** Including module: drm ***
Nov 29 05:35:44 np0005539510.novalocal dracut[1286]: *** Including module: prefixdevname ***
Nov 29 05:35:44 np0005539510.novalocal dracut[1286]: *** Including module: kernel-modules ***
Nov 29 05:35:44 np0005539510.novalocal kernel: block vda: the capability attribute has been deprecated.
Nov 29 05:35:45 np0005539510.novalocal dracut[1286]: *** Including module: kernel-modules-extra ***
Nov 29 05:35:45 np0005539510.novalocal dracut[1286]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 29 05:35:45 np0005539510.novalocal dracut[1286]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 29 05:35:45 np0005539510.novalocal dracut[1286]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 29 05:35:45 np0005539510.novalocal dracut[1286]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 29 05:35:45 np0005539510.novalocal dracut[1286]: *** Including module: qemu ***
Nov 29 05:35:45 np0005539510.novalocal dracut[1286]: *** Including module: fstab-sys ***
Nov 29 05:35:45 np0005539510.novalocal dracut[1286]: *** Including module: rootfs-block ***
Nov 29 05:35:45 np0005539510.novalocal dracut[1286]: *** Including module: terminfo ***
Nov 29 05:35:45 np0005539510.novalocal dracut[1286]: *** Including module: udev-rules ***
Nov 29 05:35:46 np0005539510.novalocal dracut[1286]: Skipping udev rule: 91-permissions.rules
Nov 29 05:35:46 np0005539510.novalocal dracut[1286]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 29 05:35:46 np0005539510.novalocal dracut[1286]: *** Including module: virtiofs ***
Nov 29 05:35:46 np0005539510.novalocal dracut[1286]: *** Including module: dracut-systemd ***
Nov 29 05:35:46 np0005539510.novalocal dracut[1286]: *** Including module: usrmount ***
Nov 29 05:35:46 np0005539510.novalocal dracut[1286]: *** Including module: base ***
Nov 29 05:35:46 np0005539510.novalocal dracut[1286]: *** Including module: fs-lib ***
Nov 29 05:35:46 np0005539510.novalocal dracut[1286]: *** Including module: kdumpbase ***
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:   microcode_ctl module: mangling fw_dir
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: configuration "intel" is ignored
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 29 05:35:47 np0005539510.novalocal dracut[1286]: *** Including module: openssl ***
Nov 29 05:35:48 np0005539510.novalocal dracut[1286]: *** Including module: shutdown ***
Nov 29 05:35:48 np0005539510.novalocal dracut[1286]: *** Including module: squash ***
Nov 29 05:35:48 np0005539510.novalocal dracut[1286]: *** Including modules done ***
Nov 29 05:35:48 np0005539510.novalocal dracut[1286]: *** Installing kernel module dependencies ***
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: IRQ 25 affinity is now unmanaged
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: IRQ 31 affinity is now unmanaged
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: IRQ 28 affinity is now unmanaged
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: IRQ 32 affinity is now unmanaged
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: IRQ 30 affinity is now unmanaged
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 29 05:35:48 np0005539510.novalocal irqbalance[780]: IRQ 29 affinity is now unmanaged
Nov 29 05:35:48 np0005539510.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:35:49 np0005539510.novalocal dracut[1286]: *** Installing kernel module dependencies done ***
Nov 29 05:35:49 np0005539510.novalocal dracut[1286]: *** Resolving executable dependencies ***
Nov 29 05:35:51 np0005539510.novalocal dracut[1286]: *** Resolving executable dependencies done ***
Nov 29 05:35:51 np0005539510.novalocal dracut[1286]: *** Generating early-microcode cpio image ***
Nov 29 05:35:51 np0005539510.novalocal dracut[1286]: *** Store current command line parameters ***
Nov 29 05:35:51 np0005539510.novalocal dracut[1286]: Stored kernel commandline:
Nov 29 05:35:51 np0005539510.novalocal dracut[1286]: No dracut internal kernel commandline stored in the initramfs
Nov 29 05:35:51 np0005539510.novalocal dracut[1286]: *** Install squash loader ***
Nov 29 05:35:52 np0005539510.novalocal dracut[1286]: *** Squashing the files inside the initramfs ***
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: *** Squashing the files inside the initramfs done ***
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: *** Hardlinking files ***
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: Mode:           real
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: Files:          50
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: Linked:         0 files
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: Compared:       0 xattrs
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: Compared:       0 files
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: Saved:          0 B
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: Duration:       0.000511 seconds
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: *** Hardlinking files done ***
Nov 29 05:35:53 np0005539510.novalocal dracut[1286]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 29 05:35:54 np0005539510.novalocal kdumpctl[1015]: kdump: kexec: loaded kdump kernel
Nov 29 05:35:54 np0005539510.novalocal kdumpctl[1015]: kdump: Starting kdump: [OK]
Nov 29 05:35:54 np0005539510.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 29 05:35:54 np0005539510.novalocal systemd[1]: Startup finished in 1.667s (kernel) + 2.814s (initrd) + 18.523s (userspace) = 23.004s.
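The "Startup finished" line above is systemd's boot-time summary: kernel, initrd, and userspace phases, plus their total. A small sketch (again, an annotation, not log content) that parses such a line and checks the phases really sum to the reported total:

```python
import re

# The systemd summary line copied from the log above.
LINE = ("Startup finished in 1.667s (kernel) + 2.814s (initrd) "
        "+ 18.523s (userspace) = 23.004s.")

def startup_phases(line):
    """Extract each boot phase duration (seconds) and the reported total."""
    phases = {name: float(sec)
              for sec, name in re.findall(r"([\d.]+)s \((\w+)\)", line)}
    total = float(re.search(r"= ([\d.]+)s", line).group(1))
    return phases, total

phases, total = startup_phases(LINE)
# The three phases should add up to systemd's total (allow float rounding).
assert abs(sum(phases.values()) - total) < 1e-9
print(phases, total)
```

Note the userspace phase dominates here (18.523s of 23.004s), which matches the log: most of that time is the kdump initramfs rebuild running between 05:35:41 and 05:35:54.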
Nov 29 05:36:08 np0005539510.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 05:36:31 np0005539510.novalocal sshd-session[4296]: Connection closed by authenticating user root 92.118.39.92 port 33762 [preauth]
Nov 29 05:37:40 np0005539510.novalocal sshd-session[4298]: Accepted publickey for zuul from 38.102.83.114 port 44568 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 29 05:37:40 np0005539510.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 29 05:37:40 np0005539510.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 29 05:37:40 np0005539510.novalocal systemd-logind[784]: New session 1 of user zuul.
Nov 29 05:37:40 np0005539510.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 29 05:37:40 np0005539510.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Queued start job for default target Main User Target.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Created slice User Application Slice.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Reached target Paths.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Reached target Timers.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Starting D-Bus User Message Bus Socket...
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Starting Create User's Volatile Files and Directories...
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Finished Create User's Volatile Files and Directories.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Listening on D-Bus User Message Bus Socket.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Reached target Sockets.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Reached target Basic System.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Reached target Main User Target.
Nov 29 05:37:40 np0005539510.novalocal systemd[4302]: Startup finished in 117ms.
Nov 29 05:37:40 np0005539510.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 29 05:37:40 np0005539510.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 29 05:37:40 np0005539510.novalocal sshd-session[4298]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:37:41 np0005539510.novalocal python3[4384]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:37:46 np0005539510.novalocal python3[4412]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:37:52 np0005539510.novalocal python3[4470]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:37:53 np0005539510.novalocal python3[4510]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 29 05:37:55 np0005539510.novalocal python3[4536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrxzXgpPmVv8+7+5w1Oy1RsXOPeqdxTcUlq37d0RcYulAAKXWla/qJwAX46v5xh/Mg4GnRpk77lvDWcVnOQjFYQg3OeLmFgDDNPV0YL7URmIe/MvgcqM+Kx7/SQjk+hEt7rUIqkFUjeREX60T5eTEMANFgJrljqZcBTMgYr67x4v7oFELzKuZIO0SCAprJ9NYmdRaC+DsjZjU+DuFdHBnfZCpgkTFMCda2FAS9BneAVOIMCBu5RgNVJXeAgIsPX9GNX3qDJMKOluQLOW++2gbue3S1Nrs1GMPm+IPRD4yWc9eZs1tpR1jdP1BEPBpyQRQlUn4z7BUdEogSzYiXCSmqzN1o/R3mdi16bG8e2lHve5MQFABPko8KsgVOJu0H7b7wGo/oGdXH7sdlKuGoWxWyTFcq3RcVkaVgjKtt6zeswkrpxMUv9/6NXPrhIWqdQm/wVw0Pv2p98yq10QRPyBv5yI8zcNjxueUl3aM8SZML87E6lhkUFFdAuVof+Sl5Pz8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:37:55 np0005539510.novalocal python3[4560]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:37:56 np0005539510.novalocal python3[4659]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:37:56 np0005539510.novalocal python3[4730]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394676.172576-254-274465802037065/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=601e897125784122ba5d7472ada57b1d_id_rsa follow=False checksum=5ac8bea8bfb8f348688bf24843ddb1285b2d351d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:37:57 np0005539510.novalocal python3[4853]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:37:57 np0005539510.novalocal python3[4924]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394677.1864204-308-28332985242287/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=601e897125784122ba5d7472ada57b1d_id_rsa.pub follow=False checksum=48b31d706687f3385690285b8caeaea67ea8286c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:37:58 np0005539510.novalocal irqbalance[780]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 29 05:37:58 np0005539510.novalocal irqbalance[780]: IRQ 27 affinity is now unmanaged
Nov 29 05:37:59 np0005539510.novalocal python3[4972]: ansible-ping Invoked with data=pong
Nov 29 05:38:00 np0005539510.novalocal python3[4996]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:38:02 np0005539510.novalocal python3[5054]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 29 05:38:03 np0005539510.novalocal python3[5086]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:03 np0005539510.novalocal python3[5110]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:03 np0005539510.novalocal python3[5134]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:05 np0005539510.novalocal python3[5158]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:05 np0005539510.novalocal python3[5182]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:05 np0005539510.novalocal python3[5206]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:07 np0005539510.novalocal sudo[5230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxmqennuqcjkhdkyhivjqypymndfmvwa ; /usr/bin/python3'
Nov 29 05:38:07 np0005539510.novalocal sudo[5230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:07 np0005539510.novalocal python3[5232]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:07 np0005539510.novalocal sudo[5230]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:07 np0005539510.novalocal sudo[5308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcyohgqejftrnahfjgonnsmmnjsuwzap ; /usr/bin/python3'
Nov 29 05:38:07 np0005539510.novalocal sudo[5308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:08 np0005539510.novalocal python3[5310]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:38:08 np0005539510.novalocal sudo[5308]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:08 np0005539510.novalocal sudo[5381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mstzrxrbkzerqdgelzuplhbjsoymqxht ; /usr/bin/python3'
Nov 29 05:38:08 np0005539510.novalocal sudo[5381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:08 np0005539510.novalocal python3[5383]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394687.5909472-34-181945937682816/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:08 np0005539510.novalocal sudo[5381]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:09 np0005539510.novalocal python3[5431]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:09 np0005539510.novalocal python3[5455]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:09 np0005539510.novalocal python3[5479]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:10 np0005539510.novalocal python3[5503]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:10 np0005539510.novalocal python3[5527]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:10 np0005539510.novalocal python3[5551]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:10 np0005539510.novalocal python3[5575]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:11 np0005539510.novalocal python3[5599]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:11 np0005539510.novalocal python3[5623]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:11 np0005539510.novalocal python3[5647]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:12 np0005539510.novalocal python3[5671]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:12 np0005539510.novalocal python3[5695]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:12 np0005539510.novalocal python3[5719]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:13 np0005539510.novalocal python3[5743]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:13 np0005539510.novalocal python3[5767]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:13 np0005539510.novalocal python3[5791]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:13 np0005539510.novalocal python3[5815]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:14 np0005539510.novalocal python3[5839]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:14 np0005539510.novalocal python3[5863]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:14 np0005539510.novalocal python3[5887]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:14 np0005539510.novalocal python3[5911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:15 np0005539510.novalocal python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:15 np0005539510.novalocal python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:15 np0005539510.novalocal python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:15 np0005539510.novalocal python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:16 np0005539510.novalocal python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:18 np0005539510.novalocal sudo[6055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezjwzkldnjqyrbnfzzslhagimnxoqwza ; /usr/bin/python3'
Nov 29 05:38:18 np0005539510.novalocal sudo[6055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:19 np0005539510.novalocal python3[6057]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 05:38:19 np0005539510.novalocal systemd[1]: Starting Time & Date Service...
Nov 29 05:38:19 np0005539510.novalocal systemd[1]: Started Time & Date Service.
Nov 29 05:38:19 np0005539510.novalocal systemd-timedated[6059]: Changed time zone to 'UTC' (UTC).
Nov 29 05:38:19 np0005539510.novalocal sudo[6055]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:19 np0005539510.novalocal sudo[6086]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apnoyniwtbimexwjboofcbcjmpfdtscz ; /usr/bin/python3'
Nov 29 05:38:19 np0005539510.novalocal sudo[6086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:19 np0005539510.novalocal python3[6088]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:19 np0005539510.novalocal sudo[6086]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:20 np0005539510.novalocal python3[6164]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:38:20 np0005539510.novalocal python3[6235]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764394700.0409346-254-112154858688938/source _original_basename=tmpzfxmfq3f follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:21 np0005539510.novalocal python3[6335]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:38:21 np0005539510.novalocal python3[6406]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764394701.144427-304-50445399397328/source _original_basename=tmp21_6hg5b follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:22 np0005539510.novalocal sudo[6506]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mispyxmnyecfoyimpgzlgloufqcwozfh ; /usr/bin/python3'
Nov 29 05:38:22 np0005539510.novalocal sudo[6506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:22 np0005539510.novalocal python3[6508]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:38:22 np0005539510.novalocal sudo[6506]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:22 np0005539510.novalocal sudo[6579]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibtlsureawvhgeqvkjzakompwqsczywd ; /usr/bin/python3'
Nov 29 05:38:22 np0005539510.novalocal sudo[6579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:22 np0005539510.novalocal python3[6581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764394702.42867-384-118647596943086/source _original_basename=tmp9vticgkx follow=False checksum=de28d19618025176a7a65eba0e40c742fe7af9f4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:22 np0005539510.novalocal sudo[6579]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:23 np0005539510.novalocal python3[6629]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:38:23 np0005539510.novalocal python3[6655]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:38:24 np0005539510.novalocal sudo[6733]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouzleykjusfoyqnmnavxvqwbtnqhkxuz ; /usr/bin/python3'
Nov 29 05:38:24 np0005539510.novalocal sudo[6733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:24 np0005539510.novalocal python3[6735]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:38:24 np0005539510.novalocal sudo[6733]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:24 np0005539510.novalocal sudo[6806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkvvgnugwldiryddidtyeokimuofbjll ; /usr/bin/python3'
Nov 29 05:38:24 np0005539510.novalocal sudo[6806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:25 np0005539510.novalocal python3[6808]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394704.4042313-454-225249419640179/source _original_basename=tmpugphsqpn follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:25 np0005539510.novalocal sudo[6806]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:25 np0005539510.novalocal sudo[6857]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srkvowageeboisecvpvnhynlnqhsivgy ; /usr/bin/python3'
Nov 29 05:38:25 np0005539510.novalocal sudo[6857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:25 np0005539510.novalocal python3[6859]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-3d5b-5bb0-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:38:25 np0005539510.novalocal sudo[6857]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:26 np0005539510.novalocal python3[6887]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-3d5b-5bb0-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 29 05:38:27 np0005539510.novalocal python3[6915]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:46 np0005539510.novalocal sudo[6939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgwaiqbutumxbuduncmapcsmpzysrzmv ; /usr/bin/python3'
Nov 29 05:38:46 np0005539510.novalocal sudo[6939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:46 np0005539510.novalocal python3[6941]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:46 np0005539510.novalocal sudo[6939]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:48 np0005539510.novalocal sshd-session[6942]: Invalid user test from 92.118.39.92 port 55410
Nov 29 05:38:48 np0005539510.novalocal sshd-session[6942]: Connection closed by invalid user test 92.118.39.92 port 55410 [preauth]
Nov 29 05:38:49 np0005539510.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 05:39:46 np0005539510.novalocal sshd-session[4311]: Received disconnect from 38.102.83.114 port 44568:11: disconnected by user
Nov 29 05:39:46 np0005539510.novalocal sshd-session[4311]: Disconnected from user zuul 38.102.83.114 port 44568
Nov 29 05:39:46 np0005539510.novalocal sshd-session[4298]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:39:46 np0005539510.novalocal systemd-logind[784]: Session 1 logged out. Waiting for processes to exit.
Nov 29 05:40:17 np0005539510.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 05:40:17 np0005539510.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 29 05:40:17 np0005539510.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 29 05:40:17 np0005539510.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 29 05:40:17 np0005539510.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 29 05:40:17 np0005539510.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 29 05:40:17 np0005539510.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 29 05:40:17 np0005539510.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 29 05:40:17 np0005539510.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 29 05:40:17 np0005539510.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5536] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 05:40:17 np0005539510.novalocal systemd-udevd[6947]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5730] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5754] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5758] device (eth1): carrier: link connected
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5760] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5765] policy: auto-activating connection 'Wired connection 1' (30147632-9597-375e-a51b-e6c74b52332e)
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5769] device (eth1): Activation: starting connection 'Wired connection 1' (30147632-9597-375e-a51b-e6c74b52332e)
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5770] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5774] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5778] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 05:40:17 np0005539510.novalocal NetworkManager[857]: <info>  [1764394817.5782] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:40:17 np0005539510.novalocal systemd[4302]: Starting Mark boot as successful...
Nov 29 05:40:17 np0005539510.novalocal systemd[4302]: Finished Mark boot as successful.
Nov 29 05:40:18 np0005539510.novalocal sshd-session[6951]: Accepted publickey for zuul from 38.102.83.114 port 55014 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:40:18 np0005539510.novalocal systemd-logind[784]: New session 3 of user zuul.
Nov 29 05:40:18 np0005539510.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 29 05:40:18 np0005539510.novalocal sshd-session[6951]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:40:18 np0005539510.novalocal python3[6978]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-4e5a-44df-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:40:21 np0005539510.novalocal sshd-session[6981]: Connection closed by authenticating user root 141.94.154.244 port 36562 [preauth]
Nov 29 05:40:28 np0005539510.novalocal sudo[7058]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktwkqrhmddiixjgegmrqatcveezqkfsa ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:40:28 np0005539510.novalocal sudo[7058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:40:28 np0005539510.novalocal python3[7060]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:40:28 np0005539510.novalocal sudo[7058]: pam_unix(sudo:session): session closed for user root
Nov 29 05:40:29 np0005539510.novalocal sudo[7131]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mirfvqvovmwtmbmbsaurazwfvnxqhtcl ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:40:29 np0005539510.novalocal sudo[7131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:40:29 np0005539510.novalocal python3[7133]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394828.6393135-206-59077736288529/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=348c98c735136a2106546cb80073c0e23d947857 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:40:29 np0005539510.novalocal sudo[7131]: pam_unix(sudo:session): session closed for user root
Nov 29 05:40:29 np0005539510.novalocal sudo[7181]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hazaitivkbntmjohefpqzoxrogqretlm ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:40:29 np0005539510.novalocal sudo[7181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:40:29 np0005539510.novalocal python3[7183]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Stopping Network Manager...
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[857]: <info>  [1764394829.8138] caught SIGTERM, shutting down normally.
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[857]: <info>  [1764394829.8148] dhcp4 (eth0): canceled DHCP transaction
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[857]: <info>  [1764394829.8149] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[857]: <info>  [1764394829.8149] dhcp4 (eth0): state changed no lease
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[857]: <info>  [1764394829.8152] manager: NetworkManager state is now CONNECTING
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[857]: <info>  [1764394829.8268] dhcp4 (eth1): canceled DHCP transaction
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[857]: <info>  [1764394829.8270] dhcp4 (eth1): state changed no lease
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[857]: <info>  [1764394829.8333] exiting (success)
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Stopped Network Manager.
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: NetworkManager.service: Consumed 1.876s CPU time, 9.9M memory peak.
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Starting Network Manager...
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.8797] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:631d2949-c1d4-4f67-afc4-db082a3ff43a)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.8798] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.8842] manager[0x564e29cae070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Starting Hostname Service...
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Started Hostname Service.
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9532] hostname: hostname: using hostnamed
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9532] hostname: static hostname changed from (none) to "np0005539510.novalocal"
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9537] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9541] manager[0x564e29cae070]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9541] manager[0x564e29cae070]: rfkill: WWAN hardware radio set enabled
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9563] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9564] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9564] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9565] manager: Networking is enabled by state file
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9566] settings: Loaded settings plugin: keyfile (internal)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9570] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9589] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9595] dhcp: init: Using DHCP client 'internal'
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9597] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9600] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9604] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9610] device (lo): Activation: starting connection 'lo' (5b6e73d6-4c36-495c-9d49-56d866cbd8e2)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9614] device (eth0): carrier: link connected
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9618] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9621] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9621] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9626] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9632] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9636] device (eth1): carrier: link connected
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9639] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9642] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (30147632-9597-375e-a51b-e6c74b52332e) (indicated)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9643] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9646] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9651] device (eth1): Activation: starting connection 'Wired connection 1' (30147632-9597-375e-a51b-e6c74b52332e)
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Started Network Manager.
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9656] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9658] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9660] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9662] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9663] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9666] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9668] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9670] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9673] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9679] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9682] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9689] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9691] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9707] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9709] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9722] device (lo): Activation: successful, device activated.
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9731] dhcp4 (eth0): state changed new lease, address=38.102.83.94
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9737] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9790] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9807] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9809] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9813] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9817] device (eth0): Activation: successful, device activated.
Nov 29 05:40:29 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394829.9823] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 05:40:29 np0005539510.novalocal sudo[7181]: pam_unix(sudo:session): session closed for user root
Nov 29 05:40:29 np0005539510.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 29 05:40:30 np0005539510.novalocal python3[7267]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-4e5a-44df-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:40:40 np0005539510.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:40:59 np0005539510.novalocal sshd-session[7270]: Invalid user test from 92.118.39.92 port 48846
Nov 29 05:40:59 np0005539510.novalocal sshd-session[7270]: Connection closed by invalid user test 92.118.39.92 port 48846 [preauth]
Nov 29 05:40:59 np0005539510.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3449] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 05:41:15 np0005539510.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:41:15 np0005539510.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3753] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3755] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3762] device (eth1): Activation: successful, device activated.
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3768] manager: startup complete
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3771] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <warn>  [1764394875.3776] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3782] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 29 05:41:15 np0005539510.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3928] dhcp4 (eth1): canceled DHCP transaction
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3929] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3929] dhcp4 (eth1): state changed no lease
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3943] policy: auto-activating connection 'ci-private-network' (00e95469-28f7-5d90-a077-7f69916381bc)
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3947] device (eth1): Activation: starting connection 'ci-private-network' (00e95469-28f7-5d90-a077-7f69916381bc)
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3948] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3951] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3957] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.3968] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.4016] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.4023] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 05:41:15 np0005539510.novalocal NetworkManager[7196]: <info>  [1764394875.4030] device (eth1): Activation: successful, device activated.
Nov 29 05:41:25 np0005539510.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:41:30 np0005539510.novalocal sshd-session[6954]: Received disconnect from 38.102.83.114 port 55014:11: disconnected by user
Nov 29 05:41:30 np0005539510.novalocal sshd-session[6954]: Disconnected from user zuul 38.102.83.114 port 55014
Nov 29 05:41:30 np0005539510.novalocal sshd-session[6951]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:41:30 np0005539510.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 29 05:41:30 np0005539510.novalocal systemd[1]: session-3.scope: Consumed 1.415s CPU time.
Nov 29 05:41:30 np0005539510.novalocal systemd-logind[784]: Session 3 logged out. Waiting for processes to exit.
Nov 29 05:41:30 np0005539510.novalocal systemd-logind[784]: Removed session 3.
Nov 29 05:41:45 np0005539510.novalocal sshd-session[7300]: Accepted publickey for zuul from 38.102.83.114 port 36600 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:41:45 np0005539510.novalocal systemd-logind[784]: New session 4 of user zuul.
Nov 29 05:41:45 np0005539510.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 29 05:41:45 np0005539510.novalocal sshd-session[7300]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:41:45 np0005539510.novalocal sudo[7379]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tohwxxmlfubhbalrlzsrixpcbvixczmb ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:41:45 np0005539510.novalocal sudo[7379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:41:46 np0005539510.novalocal python3[7381]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:41:46 np0005539510.novalocal sudo[7379]: pam_unix(sudo:session): session closed for user root
Nov 29 05:41:46 np0005539510.novalocal sudo[7452]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-admtxzhzpephftpszarzhebeygvvkosq ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:41:46 np0005539510.novalocal sudo[7452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:41:46 np0005539510.novalocal python3[7454]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394905.7018907-373-256392526151513/source _original_basename=tmp9heeay82 follow=False checksum=95c43167cb69fbe3f3b9eff0c3ecf63c2bbd5b70 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:41:46 np0005539510.novalocal sudo[7452]: pam_unix(sudo:session): session closed for user root
Nov 29 05:41:48 np0005539510.novalocal sshd-session[7303]: Connection closed by 38.102.83.114 port 36600
Nov 29 05:41:48 np0005539510.novalocal sshd-session[7300]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:41:48 np0005539510.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 29 05:41:48 np0005539510.novalocal systemd-logind[784]: Session 4 logged out. Waiting for processes to exit.
Nov 29 05:41:48 np0005539510.novalocal systemd-logind[784]: Removed session 4.
Nov 29 05:43:09 np0005539510.novalocal sshd-session[7480]: Connection closed by authenticating user root 92.118.39.92 port 42266 [preauth]
Nov 29 05:43:39 np0005539510.novalocal systemd[4302]: Created slice User Background Tasks Slice.
Nov 29 05:43:39 np0005539510.novalocal systemd[4302]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 05:43:39 np0005539510.novalocal systemd[4302]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 05:45:20 np0005539510.novalocal sshd-session[7483]: Invalid user user from 92.118.39.92 port 35682
Nov 29 05:45:20 np0005539510.novalocal sshd-session[7483]: Connection closed by invalid user user 92.118.39.92 port 35682 [preauth]
Nov 29 05:47:03 np0005539510.novalocal sshd-session[7487]: Accepted publickey for zuul from 38.102.83.114 port 41004 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:47:03 np0005539510.novalocal systemd-logind[784]: New session 5 of user zuul.
Nov 29 05:47:03 np0005539510.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 29 05:47:03 np0005539510.novalocal sshd-session[7487]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:47:03 np0005539510.novalocal sudo[7514]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byvdjcqnvatxvrddcqkzqzchxwvputtr ; /usr/bin/python3'
Nov 29 05:47:03 np0005539510.novalocal sudo[7514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:03 np0005539510.novalocal python3[7516]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-b110-1686-000000000ca2-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:03 np0005539510.novalocal sudo[7514]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:04 np0005539510.novalocal sudo[7543]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmxeqtohbfddhaqiehkhjplkucoflsht ; /usr/bin/python3'
Nov 29 05:47:04 np0005539510.novalocal sudo[7543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:04 np0005539510.novalocal python3[7545]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:04 np0005539510.novalocal sudo[7543]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:04 np0005539510.novalocal sudo[7569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubzcyuudrmzjclymtqhqrxwruvviwfia ; /usr/bin/python3'
Nov 29 05:47:04 np0005539510.novalocal sudo[7569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:04 np0005539510.novalocal python3[7571]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:04 np0005539510.novalocal sudo[7569]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:04 np0005539510.novalocal sudo[7595]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwqcqpiopqmiboridsrnxbyxhusdnolo ; /usr/bin/python3'
Nov 29 05:47:04 np0005539510.novalocal sudo[7595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:04 np0005539510.novalocal python3[7597]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:04 np0005539510.novalocal sudo[7595]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:04 np0005539510.novalocal sudo[7621]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoqichxduxntxpnfcyvdhvtdomjqfgcn ; /usr/bin/python3'
Nov 29 05:47:04 np0005539510.novalocal sudo[7621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:05 np0005539510.novalocal python3[7623]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:05 np0005539510.novalocal sudo[7621]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:05 np0005539510.novalocal sudo[7647]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gprstsmoqsrnyaimeplbizelxccmrtpy ; /usr/bin/python3'
Nov 29 05:47:05 np0005539510.novalocal sudo[7647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:05 np0005539510.novalocal python3[7649]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:05 np0005539510.novalocal sudo[7647]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:06 np0005539510.novalocal sudo[7725]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmeozrsgagiyzoymaqvspkbnkzbnelhm ; /usr/bin/python3'
Nov 29 05:47:06 np0005539510.novalocal sudo[7725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:06 np0005539510.novalocal python3[7727]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:47:06 np0005539510.novalocal sudo[7725]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:06 np0005539510.novalocal sudo[7798]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkmmzwamfuksxpowjtvblmnyufidhxjk ; /usr/bin/python3'
Nov 29 05:47:06 np0005539510.novalocal sudo[7798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:06 np0005539510.novalocal python3[7800]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395225.882199-368-163779388183193/source _original_basename=tmpwwfcjsx3 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:06 np0005539510.novalocal sudo[7798]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:07 np0005539510.novalocal sudo[7848]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npszdtsmosnfuefqewyxdwlkthxhxsjv ; /usr/bin/python3'
Nov 29 05:47:07 np0005539510.novalocal sudo[7848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:07 np0005539510.novalocal python3[7850]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 05:47:07 np0005539510.novalocal systemd[1]: Reloading.
Nov 29 05:47:07 np0005539510.novalocal systemd-rc-local-generator[7869]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 05:47:08 np0005539510.novalocal sudo[7848]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:09 np0005539510.novalocal sudo[7903]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifhjqfgvxxnnzpwrhpixfssowvxpgyxf ; /usr/bin/python3'
Nov 29 05:47:09 np0005539510.novalocal sudo[7903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:09 np0005539510.novalocal python3[7905]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 29 05:47:09 np0005539510.novalocal sudo[7903]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:09 np0005539510.novalocal sudo[7929]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yonlxzrknuhnobgcjospjjeydziuefls ; /usr/bin/python3'
Nov 29 05:47:10 np0005539510.novalocal sudo[7929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:10 np0005539510.novalocal python3[7931]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:10 np0005539510.novalocal sudo[7929]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:10 np0005539510.novalocal sudo[7957]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxhewuwlmlcyhzkhslojpblhnfrbaquf ; /usr/bin/python3'
Nov 29 05:47:10 np0005539510.novalocal sudo[7957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:10 np0005539510.novalocal python3[7959]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:10 np0005539510.novalocal sudo[7957]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:10 np0005539510.novalocal sudo[7985]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsyzcgduclalclpqynwzrchkbndtttil ; /usr/bin/python3'
Nov 29 05:47:10 np0005539510.novalocal sudo[7985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:10 np0005539510.novalocal python3[7987]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:10 np0005539510.novalocal sudo[7985]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:10 np0005539510.novalocal sudo[8013]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgnhdvtkdctaenyxkvaxigqlzgwncypi ; /usr/bin/python3'
Nov 29 05:47:10 np0005539510.novalocal sudo[8013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:10 np0005539510.novalocal python3[8015]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:11 np0005539510.novalocal sudo[8013]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:11 np0005539510.novalocal python3[8043]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-b110-1686-000000000ca9-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:12 np0005539510.novalocal python3[8073]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 05:47:15 np0005539510.novalocal sshd-session[7490]: Connection closed by 38.102.83.114 port 41004
Nov 29 05:47:15 np0005539510.novalocal sshd-session[7487]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:47:15 np0005539510.novalocal systemd-logind[784]: Session 5 logged out. Waiting for processes to exit.
Nov 29 05:47:15 np0005539510.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 29 05:47:15 np0005539510.novalocal systemd[1]: session-5.scope: Consumed 4.398s CPU time.
Nov 29 05:47:15 np0005539510.novalocal systemd-logind[784]: Removed session 5.
Nov 29 05:47:16 np0005539510.novalocal sshd-session[8077]: Accepted publickey for zuul from 38.102.83.114 port 56506 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:47:16 np0005539510.novalocal systemd-logind[784]: New session 6 of user zuul.
Nov 29 05:47:16 np0005539510.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 29 05:47:16 np0005539510.novalocal sshd-session[8077]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:47:16 np0005539510.novalocal sudo[8104]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcckzojtjvykzzsqpyvarcqyweyhvmyu ; /usr/bin/python3'
Nov 29 05:47:16 np0005539510.novalocal sudo[8104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:17 np0005539510.novalocal python3[8106]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 05:47:24 np0005539510.novalocal sshd-session[8150]: Invalid user admin from 92.118.39.92 port 57338
Nov 29 05:47:25 np0005539510.novalocal sshd-session[8150]: Connection closed by invalid user admin 92.118.39.92 port 57338 [preauth]
Nov 29 05:47:30 np0005539510.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 29 05:47:30 np0005539510.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:47:30 np0005539510.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:47:30 np0005539510.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:47:30 np0005539510.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:47:30 np0005539510.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:47:30 np0005539510.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:47:30 np0005539510.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:47:38 np0005539510.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 29 05:47:38 np0005539510.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:47:38 np0005539510.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:47:38 np0005539510.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:47:38 np0005539510.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:47:38 np0005539510.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:47:38 np0005539510.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:47:38 np0005539510.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:47:46 np0005539510.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 29 05:47:46 np0005539510.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:47:46 np0005539510.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:47:46 np0005539510.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:47:46 np0005539510.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:47:46 np0005539510.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:47:46 np0005539510.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:47:46 np0005539510.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:47:48 np0005539510.novalocal setsebool[8174]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 29 05:47:48 np0005539510.novalocal setsebool[8174]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 29 05:47:59 np0005539510.novalocal kernel: SELinux:  Converting 388 SID table entries...
Nov 29 05:47:59 np0005539510.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:47:59 np0005539510.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:47:59 np0005539510.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:47:59 np0005539510.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:47:59 np0005539510.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:47:59 np0005539510.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:47:59 np0005539510.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:48:18 np0005539510.novalocal dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 05:48:18 np0005539510.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 05:48:18 np0005539510.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 29 05:48:18 np0005539510.novalocal systemd[1]: Reloading.
Nov 29 05:48:18 np0005539510.novalocal systemd-rc-local-generator[8931]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 05:48:18 np0005539510.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 05:48:19 np0005539510.novalocal sudo[8104]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:24 np0005539510.novalocal python3[13473]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-4d52-d96a-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:48:25 np0005539510.novalocal kernel: evm: overlay not supported
Nov 29 05:48:25 np0005539510.novalocal systemd[4302]: Starting D-Bus User Message Bus...
Nov 29 05:48:25 np0005539510.novalocal dbus-broker-launch[14032]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 29 05:48:25 np0005539510.novalocal dbus-broker-launch[14032]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 29 05:48:25 np0005539510.novalocal systemd[4302]: Started D-Bus User Message Bus.
Nov 29 05:48:25 np0005539510.novalocal dbus-broker-lau[14032]: Ready
Nov 29 05:48:25 np0005539510.novalocal systemd[4302]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 05:48:25 np0005539510.novalocal systemd[4302]: Created slice Slice /user.
Nov 29 05:48:25 np0005539510.novalocal systemd[4302]: podman-13957.scope: unit configures an IP firewall, but not running as root.
Nov 29 05:48:25 np0005539510.novalocal systemd[4302]: (This warning is only shown for the first unit using IP firewalling.)
Nov 29 05:48:25 np0005539510.novalocal systemd[4302]: Started podman-13957.scope.
Nov 29 05:48:25 np0005539510.novalocal systemd[4302]: Started podman-pause-164fd77e.scope.
Nov 29 05:48:26 np0005539510.novalocal sudo[14220]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbywghqontvcukjxwziektsxmqhbylse ; /usr/bin/python3'
Nov 29 05:48:26 np0005539510.novalocal sudo[14220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:26 np0005539510.novalocal python3[14237]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.97:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.97:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:48:26 np0005539510.novalocal python3[14237]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 29 05:48:26 np0005539510.novalocal sudo[14220]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:26 np0005539510.novalocal sshd-session[8080]: Connection closed by 38.102.83.114 port 56506
Nov 29 05:48:26 np0005539510.novalocal sshd-session[8077]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:48:26 np0005539510.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Nov 29 05:48:26 np0005539510.novalocal systemd[1]: session-6.scope: Consumed 57.640s CPU time.
Nov 29 05:48:26 np0005539510.novalocal systemd-logind[784]: Session 6 logged out. Waiting for processes to exit.
Nov 29 05:48:26 np0005539510.novalocal systemd-logind[784]: Removed session 6.
Nov 29 05:48:47 np0005539510.novalocal sshd-session[22091]: Connection closed by 38.102.83.107 port 59168 [preauth]
Nov 29 05:48:47 np0005539510.novalocal sshd-session[22092]: Unable to negotiate with 38.102.83.107 port 59170: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 29 05:48:47 np0005539510.novalocal sshd-session[22094]: Unable to negotiate with 38.102.83.107 port 59196: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 29 05:48:47 np0005539510.novalocal sshd-session[22099]: Connection closed by 38.102.83.107 port 59156 [preauth]
Nov 29 05:48:47 np0005539510.novalocal sshd-session[22097]: Unable to negotiate with 38.102.83.107 port 59184: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 29 05:48:52 np0005539510.novalocal sshd-session[23584]: Accepted publickey for zuul from 38.102.83.114 port 43486 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:48:52 np0005539510.novalocal systemd-logind[784]: New session 7 of user zuul.
Nov 29 05:48:52 np0005539510.novalocal systemd[1]: Started Session 7 of User zuul.
Nov 29 05:48:52 np0005539510.novalocal sshd-session[23584]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:48:52 np0005539510.novalocal python3[23690]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:48:53 np0005539510.novalocal sudo[23883]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvhrqnwqxfjbofghuyzkbbkspuqygtxl ; /usr/bin/python3'
Nov 29 05:48:53 np0005539510.novalocal sudo[23883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:53 np0005539510.novalocal python3[23894]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:48:53 np0005539510.novalocal sudo[23883]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:54 np0005539510.novalocal sudo[24277]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndmaukdljwovlvcvotpqctzwxzuympaz ; /usr/bin/python3'
Nov 29 05:48:54 np0005539510.novalocal sudo[24277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:54 np0005539510.novalocal python3[24287]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005539510.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 29 05:48:54 np0005539510.novalocal useradd[24358]: new group: name=cloud-admin, GID=1002
Nov 29 05:48:54 np0005539510.novalocal useradd[24358]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 29 05:48:54 np0005539510.novalocal sudo[24277]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:54 np0005539510.novalocal sudo[24530]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iezoinkkxdtrpiedyonhefeujyfntxue ; /usr/bin/python3'
Nov 29 05:48:54 np0005539510.novalocal sudo[24530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:54 np0005539510.novalocal python3[24538]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:48:54 np0005539510.novalocal sudo[24530]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:55 np0005539510.novalocal sudo[24792]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bykmavymubwiystqfsojfgfmyjkaehap ; /usr/bin/python3'
Nov 29 05:48:55 np0005539510.novalocal sudo[24792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:55 np0005539510.novalocal python3[24799]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:48:55 np0005539510.novalocal sudo[24792]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:55 np0005539510.novalocal sudo[25061]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vctjlyftaijcdkptvatnnchlquepccqe ; /usr/bin/python3'
Nov 29 05:48:55 np0005539510.novalocal sudo[25061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:55 np0005539510.novalocal python3[25068]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395335.0963047-170-205090840395830/source _original_basename=tmpd5cvm_7s follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:48:55 np0005539510.novalocal sudo[25061]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:56 np0005539510.novalocal sudo[25381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gstqtowjukaqinmheeblnxxjuuvcofna ; /usr/bin/python3'
Nov 29 05:48:56 np0005539510.novalocal sudo[25381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:56 np0005539510.novalocal python3[25387]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Nov 29 05:48:56 np0005539510.novalocal systemd[1]: Starting Hostname Service...
Nov 29 05:48:56 np0005539510.novalocal systemd[1]: Started Hostname Service.
Nov 29 05:48:56 np0005539510.novalocal systemd-hostnamed[25496]: Changed pretty hostname to 'compute-2'
Nov 29 05:48:57 compute-2 systemd-hostnamed[25496]: Hostname set to <compute-2> (static)
Nov 29 05:48:57 compute-2 NetworkManager[7196]: <info>  [1764395337.0010] hostname: static hostname changed from "np0005539510.novalocal" to "compute-2"
Nov 29 05:48:57 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:48:57 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:48:57 compute-2 sudo[25381]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:57 compute-2 sshd-session[23630]: Connection closed by 38.102.83.114 port 43486
Nov 29 05:48:57 compute-2 sshd-session[23584]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:48:57 compute-2 systemd[1]: session-7.scope: Deactivated successfully.
Nov 29 05:48:57 compute-2 systemd[1]: session-7.scope: Consumed 2.360s CPU time.
Nov 29 05:48:57 compute-2 systemd-logind[784]: Session 7 logged out. Waiting for processes to exit.
Nov 29 05:48:57 compute-2 systemd-logind[784]: Removed session 7.
Nov 29 05:49:07 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:49:09 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 05:49:09 compute-2 systemd[1]: Finished man-db-cache-update.service.
Nov 29 05:49:09 compute-2 systemd[1]: man-db-cache-update.service: Consumed 1min 3.667s CPU time.
Nov 29 05:49:09 compute-2 systemd[1]: run-r74f79890b8ab4be6abf78398fd034a1b.service: Deactivated successfully.
Nov 29 05:49:27 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 05:49:29 compute-2 sshd-session[29929]: Connection closed by authenticating user root 92.118.39.92 port 50752 [preauth]
Nov 29 05:50:39 compute-2 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 29 05:50:39 compute-2 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 29 05:50:39 compute-2 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 29 05:50:39 compute-2 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 29 05:51:43 compute-2 sshd-session[29936]: Connection closed by authenticating user root 92.118.39.92 port 44176 [preauth]
Nov 29 05:53:01 compute-2 sshd-session[29939]: Accepted publickey for zuul from 38.102.83.107 port 58250 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:53:01 compute-2 systemd-logind[784]: New session 8 of user zuul.
Nov 29 05:53:01 compute-2 systemd[1]: Started Session 8 of User zuul.
Nov 29 05:53:01 compute-2 sshd-session[29939]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:53:02 compute-2 python3[30015]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:53:04 compute-2 sudo[30129]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emwgfhpluxatdtnwfeiivbzokamgixbh ; /usr/bin/python3'
Nov 29 05:53:04 compute-2 sudo[30129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:04 compute-2 python3[30131]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:04 compute-2 sudo[30129]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:04 compute-2 sudo[30202]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxrvsjadwwmpyjizmqqlrqnhxuvkiojb ; /usr/bin/python3'
Nov 29 05:53:04 compute-2 sudo[30202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:04 compute-2 python3[30204]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=delorean.repo follow=False checksum=a16f090252000d02a7f7d540bb10f7c1c9cd4ac5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:04 compute-2 sudo[30202]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:04 compute-2 sudo[30228]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oggilqhkqwecwksqxcipisjtflmrrkhe ; /usr/bin/python3'
Nov 29 05:53:04 compute-2 sudo[30228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:05 compute-2 python3[30230]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:05 compute-2 sudo[30228]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:05 compute-2 sudo[30301]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvptnpzdrfohxiacpytxcpqnpuzyqiwn ; /usr/bin/python3'
Nov 29 05:53:05 compute-2 sudo[30301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:05 compute-2 python3[30303]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:05 compute-2 sudo[30301]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:05 compute-2 sudo[30327]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpdttiqnniyxcccxwkgqlgfitjiucfas ; /usr/bin/python3'
Nov 29 05:53:05 compute-2 sudo[30327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:05 compute-2 python3[30329]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:05 compute-2 sudo[30327]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:05 compute-2 sudo[30400]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtmslyammfwtdfqytarvophvqyyfnmfd ; /usr/bin/python3'
Nov 29 05:53:05 compute-2 sudo[30400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:06 compute-2 python3[30402]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:06 compute-2 sudo[30400]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:06 compute-2 sudo[30426]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlkqepghahflztzhyqtosqabhozonnpe ; /usr/bin/python3'
Nov 29 05:53:06 compute-2 sudo[30426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:06 compute-2 python3[30428]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:06 compute-2 sudo[30426]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:06 compute-2 sudo[30499]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcfucnbnlsldrhfeitionzjqmkrxyfxd ; /usr/bin/python3'
Nov 29 05:53:06 compute-2 sudo[30499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:06 compute-2 python3[30501]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:06 compute-2 sudo[30499]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:06 compute-2 sudo[30525]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ophfcjeaguyrlqddxaebuadlvaladnhw ; /usr/bin/python3'
Nov 29 05:53:06 compute-2 sudo[30525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:07 compute-2 python3[30527]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:07 compute-2 sudo[30525]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:07 compute-2 sudo[30598]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arjojvxnnykhstozetxqslvsszkclogj ; /usr/bin/python3'
Nov 29 05:53:07 compute-2 sudo[30598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:07 compute-2 python3[30600]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:07 compute-2 sudo[30598]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:07 compute-2 sudo[30624]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikbhpywmtlwimqfbjjwmymyifduivmcc ; /usr/bin/python3'
Nov 29 05:53:07 compute-2 sudo[30624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:07 compute-2 python3[30626]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:07 compute-2 sudo[30624]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:07 compute-2 sudo[30697]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kscicjkqqskzbqstthayhvxzpedepmfg ; /usr/bin/python3'
Nov 29 05:53:07 compute-2 sudo[30697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:08 compute-2 python3[30699]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:08 compute-2 sudo[30697]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:08 compute-2 sudo[30723]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izhywbhutxdfsiunjaodltmfsluepcgg ; /usr/bin/python3'
Nov 29 05:53:08 compute-2 sudo[30723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:08 compute-2 python3[30725]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:08 compute-2 sudo[30723]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:08 compute-2 sudo[30796]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgzjzldhxwxxkicdvzhynqdqxljoblib ; /usr/bin/python3'
Nov 29 05:53:08 compute-2 sudo[30796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:08 compute-2 python3[30798]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=25e801a9a05537c191e2aa500f19076ac31d3e5b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:08 compute-2 sudo[30796]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:20 compute-2 python3[30846]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:53:55 compute-2 sshd-session[30848]: Invalid user guest from 92.118.39.92 port 37624
Nov 29 05:53:55 compute-2 sshd-session[30848]: Connection closed by invalid user guest 92.118.39.92 port 37624 [preauth]
Nov 29 05:56:06 compute-2 sshd-session[30852]: Invalid user ansible from 92.118.39.92 port 59278
Nov 29 05:56:06 compute-2 sshd-session[30852]: Connection closed by invalid user ansible 92.118.39.92 port 59278 [preauth]
Nov 29 05:58:21 compute-2 sshd-session[29942]: Received disconnect from 38.102.83.107 port 58250:11: disconnected by user
Nov 29 05:58:21 compute-2 sshd-session[29942]: Disconnected from user zuul 38.102.83.107 port 58250
Nov 29 05:58:21 compute-2 sshd-session[29939]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:58:21 compute-2 systemd-logind[784]: Session 8 logged out. Waiting for processes to exit.
Nov 29 05:58:21 compute-2 systemd[1]: session-8.scope: Deactivated successfully.
Nov 29 05:58:21 compute-2 systemd[1]: session-8.scope: Consumed 5.607s CPU time.
Nov 29 05:58:21 compute-2 systemd-logind[784]: Removed session 8.
Nov 29 05:58:21 compute-2 sshd-session[30855]: Invalid user uftp from 92.118.39.92 port 52690
Nov 29 05:58:21 compute-2 sshd-session[30855]: Connection closed by invalid user uftp 92.118.39.92 port 52690 [preauth]
Nov 29 06:00:27 compute-2 sshd-session[30857]: Invalid user uftp from 92.118.39.92 port 46112
Nov 29 06:00:27 compute-2 sshd-session[30857]: Connection closed by invalid user uftp 92.118.39.92 port 46112 [preauth]
Nov 29 06:01:01 compute-2 CROND[30862]: (root) CMD (run-parts /etc/cron.hourly)
Nov 29 06:01:01 compute-2 run-parts[30865]: (/etc/cron.hourly) starting 0anacron
Nov 29 06:01:01 compute-2 anacron[30873]: Anacron started on 2025-11-29
Nov 29 06:01:01 compute-2 anacron[30873]: Will run job `cron.daily' in 27 min.
Nov 29 06:01:01 compute-2 anacron[30873]: Will run job `cron.weekly' in 47 min.
Nov 29 06:01:01 compute-2 anacron[30873]: Will run job `cron.monthly' in 67 min.
Nov 29 06:01:01 compute-2 anacron[30873]: Jobs will be executed sequentially
Nov 29 06:01:01 compute-2 run-parts[30875]: (/etc/cron.hourly) finished 0anacron
Nov 29 06:01:01 compute-2 CROND[30861]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 29 06:02:33 compute-2 sshd-session[30876]: Invalid user svn from 92.118.39.92 port 39532
Nov 29 06:02:33 compute-2 sshd-session[30876]: Connection closed by invalid user svn 92.118.39.92 port 39532 [preauth]
Nov 29 06:04:45 compute-2 sshd-session[30880]: Invalid user git from 92.118.39.92 port 32958
Nov 29 06:04:45 compute-2 sshd-session[30880]: Connection closed by invalid user git 92.118.39.92 port 32958 [preauth]
Nov 29 06:06:27 compute-2 sshd-session[30882]: Accepted publickey for zuul from 192.168.122.30 port 49322 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:06:27 compute-2 systemd-logind[784]: New session 9 of user zuul.
Nov 29 06:06:27 compute-2 systemd[1]: Started Session 9 of User zuul.
Nov 29 06:06:27 compute-2 sshd-session[30882]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:06:28 compute-2 python3.9[31035]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:06:29 compute-2 sudo[31214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzjdrtwsakejznqxqabdveormmamuatm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396388.8548994-64-267062415519458/AnsiballZ_command.py'
Nov 29 06:06:29 compute-2 sudo[31214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:06:29 compute-2 python3.9[31216]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:06:36 compute-2 sudo[31214]: pam_unix(sudo:session): session closed for user root
Nov 29 06:06:38 compute-2 sshd-session[30885]: Connection closed by 192.168.122.30 port 49322
Nov 29 06:06:38 compute-2 sshd-session[30882]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:06:38 compute-2 systemd[1]: session-9.scope: Deactivated successfully.
Nov 29 06:06:38 compute-2 systemd[1]: session-9.scope: Consumed 7.719s CPU time.
Nov 29 06:06:38 compute-2 systemd-logind[784]: Session 9 logged out. Waiting for processes to exit.
Nov 29 06:06:38 compute-2 systemd-logind[784]: Removed session 9.
Nov 29 06:06:51 compute-2 sshd-session[31277]: Invalid user git from 92.118.39.92 port 54620
Nov 29 06:06:51 compute-2 sshd-session[31277]: Connection closed by invalid user git 92.118.39.92 port 54620 [preauth]
Nov 29 06:06:54 compute-2 sshd-session[31279]: Accepted publickey for zuul from 192.168.122.30 port 44050 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:06:54 compute-2 systemd-logind[784]: New session 10 of user zuul.
Nov 29 06:06:54 compute-2 systemd[1]: Started Session 10 of User zuul.
Nov 29 06:06:54 compute-2 sshd-session[31279]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:06:55 compute-2 python3.9[31432]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 06:06:56 compute-2 python3.9[31606]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:06:57 compute-2 sudo[31756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlntqpkxkcbrhnlioqokzexfsepjmzpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396417.239094-100-226573263380128/AnsiballZ_command.py'
Nov 29 06:06:57 compute-2 sudo[31756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:06:57 compute-2 python3.9[31758]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:06:57 compute-2 sudo[31756]: pam_unix(sudo:session): session closed for user root
Nov 29 06:06:59 compute-2 sudo[31909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqbyugtugyrlkhkvnlpjuikuaouifdfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396418.5054705-136-158283189314494/AnsiballZ_stat.py'
Nov 29 06:06:59 compute-2 sudo[31909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:06:59 compute-2 python3.9[31911]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:06:59 compute-2 sudo[31909]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:00 compute-2 sudo[32061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbxavanpporsjginsicxugqggtzodcmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396419.5055292-161-238954038433622/AnsiballZ_file.py'
Nov 29 06:07:00 compute-2 sudo[32061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:00 compute-2 python3.9[32063]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:07:00 compute-2 sudo[32061]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:01 compute-2 sudo[32213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pufatnqgxufnnkebskrivodxbbsykrax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396420.8783379-184-190824286794721/AnsiballZ_stat.py'
Nov 29 06:07:01 compute-2 sudo[32213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:01 compute-2 python3.9[32215]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:07:01 compute-2 sudo[32213]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:02 compute-2 sudo[32336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fopxbliglymgvrbesdfwylzxhtpgantd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396420.8783379-184-190824286794721/AnsiballZ_copy.py'
Nov 29 06:07:02 compute-2 sudo[32336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:02 compute-2 python3.9[32338]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396420.8783379-184-190824286794721/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:07:02 compute-2 sudo[32336]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:02 compute-2 sudo[32488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-venphebgkfxtrquhiwuecwfkzaxlhsla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396422.4610357-229-13293674136547/AnsiballZ_setup.py'
Nov 29 06:07:02 compute-2 sudo[32488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:03 compute-2 python3.9[32490]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:07:03 compute-2 sudo[32488]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:03 compute-2 sudo[32644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgkqoidpyzotspxdpxusnldzyqkpennx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396423.6081936-253-160118912914850/AnsiballZ_file.py'
Nov 29 06:07:03 compute-2 sudo[32644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:04 compute-2 python3.9[32646]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:07:04 compute-2 sudo[32644]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:04 compute-2 sudo[32796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmctjpykgbjtnltlxihfucefmiblswur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396424.5462775-280-181076521096218/AnsiballZ_file.py'
Nov 29 06:07:04 compute-2 sudo[32796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:05 compute-2 python3.9[32798]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:07:05 compute-2 sudo[32796]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:05 compute-2 python3.9[32948]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:07:09 compute-2 python3.9[33201]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:07:10 compute-2 python3.9[33351]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:07:12 compute-2 python3.9[33505]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:07:13 compute-2 sudo[33661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgelyuyexxomodcljfhsbnjfhuuoptcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396432.8014863-425-18664590124746/AnsiballZ_setup.py'
Nov 29 06:07:13 compute-2 sudo[33661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:13 compute-2 python3.9[33663]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:07:13 compute-2 sudo[33661]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:14 compute-2 sudo[33745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ritvsshtaueoylklcswykhqwojnjkmsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396432.8014863-425-18664590124746/AnsiballZ_dnf.py'
Nov 29 06:07:14 compute-2 sudo[33745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:14 compute-2 python3.9[33747]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:07:56 compute-2 systemd[1]: Reloading.
Nov 29 06:07:56 compute-2 systemd-rc-local-generator[33945]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:07:56 compute-2 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 29 06:07:56 compute-2 systemd[1]: Reloading.
Nov 29 06:07:56 compute-2 systemd-rc-local-generator[33983]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:07:57 compute-2 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 29 06:07:57 compute-2 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 29 06:07:57 compute-2 systemd[1]: Reloading.
Nov 29 06:07:57 compute-2 systemd-rc-local-generator[34022]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:07:57 compute-2 systemd[1]: Starting dnf makecache...
Nov 29 06:07:57 compute-2 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 29 06:07:57 compute-2 dnf[34032]: Failed determining last makecache time.
Nov 29 06:07:57 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 06:07:57 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-openstack-barbican-42b4c41831408a8e323 155 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 188 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-openstack-cinder-1c00d6490d88e436f26ef 190 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-python-stevedore-c4acc5639fd2329372142 190 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-python-cloudkitty-tests-tempest-2c80f8 197 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-os-net-config-9758ab42364673d01bc5014e 193 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 207 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-python-designate-tests-tempest-347fdbc 208 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-openstack-glance-1fd12c29b339f30fe823e 189 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 185 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-openstack-manila-3c01b7181572c95dac462 186 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-python-whitebox-neutron-tests-tempest- 191 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-openstack-octavia-ba397f07a7331190208c 183 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-openstack-watcher-c014f81a8647287f6dcc 194 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-python-tcib-1124124ec06aadbac34f0d340b 175 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 161 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-openstack-swift-dc98a8463506ac520c469a 161 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-python-tempestconf-8515371b7cceebd4282 173 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: delorean-openstack-heat-ui-013accbfd179753bc3f0 181 kB/s | 3.0 kB     00:00
Nov 29 06:07:57 compute-2 dnf[34032]: CentOS Stream 9 - BaseOS                         79 kB/s | 7.3 kB     00:00
Nov 29 06:07:58 compute-2 dnf[34032]: CentOS Stream 9 - AppStream                      32 kB/s | 7.4 kB     00:00
Nov 29 06:07:58 compute-2 dnf[34032]: CentOS Stream 9 - CRB                            31 kB/s | 7.2 kB     00:00
Nov 29 06:07:58 compute-2 dnf[34032]: CentOS Stream 9 - Extras packages                74 kB/s | 8.3 kB     00:00
Nov 29 06:07:58 compute-2 dnf[34032]: dlrn-antelope-testing                           138 kB/s | 3.0 kB     00:00
Nov 29 06:07:58 compute-2 dnf[34032]: dlrn-antelope-build-deps                        135 kB/s | 3.0 kB     00:00
Nov 29 06:07:58 compute-2 dnf[34032]: centos9-rabbitmq                                107 kB/s | 3.0 kB     00:00
Nov 29 06:07:58 compute-2 dnf[34032]: centos9-storage                                 116 kB/s | 3.0 kB     00:00
Nov 29 06:07:58 compute-2 dnf[34032]: centos9-opstools                                142 kB/s | 3.0 kB     00:00
Nov 29 06:07:58 compute-2 dnf[34032]: NFV SIG OpenvSwitch                             150 kB/s | 3.0 kB     00:00
Nov 29 06:07:58 compute-2 dnf[34032]: repo-setup-centos-appstream                     192 kB/s | 4.4 kB     00:00
Nov 29 06:07:59 compute-2 dnf[34032]: repo-setup-centos-baseos                         97 kB/s | 3.9 kB     00:00
Nov 29 06:07:59 compute-2 dnf[34032]: repo-setup-centos-highavailability              183 kB/s | 3.9 kB     00:00
Nov 29 06:07:59 compute-2 dnf[34032]: repo-setup-centos-powertools                    193 kB/s | 4.3 kB     00:00
Nov 29 06:07:59 compute-2 dnf[34032]: Extra Packages for Enterprise Linux 9 - x86_64  109 kB/s |  33 kB     00:00
Nov 29 06:08:00 compute-2 dnf[34032]: Metadata cache created.
Nov 29 06:08:00 compute-2 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 29 06:08:00 compute-2 systemd[1]: Finished dnf makecache.
Nov 29 06:08:00 compute-2 systemd[1]: dnf-makecache.service: Consumed 1.854s CPU time.
Nov 29 06:09:02 compute-2 sshd-session[34302]: Connection closed by authenticating user root 92.118.39.92 port 48036 [preauth]
Nov 29 06:09:06 compute-2 kernel: SELinux:  Converting 2718 SID table entries...
Nov 29 06:09:06 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:09:06 compute-2 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:09:06 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:09:06 compute-2 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:09:06 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:09:06 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:09:06 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:09:07 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 29 06:09:07 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:09:07 compute-2 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:09:07 compute-2 systemd[1]: Reloading.
Nov 29 06:09:08 compute-2 systemd-rc-local-generator[34430]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:09:08 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:09:10 compute-2 sudo[33745]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:11 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:09:11 compute-2 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:09:11 compute-2 systemd[1]: man-db-cache-update.service: Consumed 1.506s CPU time.
Nov 29 06:09:11 compute-2 systemd[1]: run-re5c9407fe725444398094d54ef3c8658.service: Deactivated successfully.
Nov 29 06:09:18 compute-2 sudo[35341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scqsjpcwwkpzzwtcoawfwmmunmksvgcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396556.2366216-461-48291568927940/AnsiballZ_command.py'
Nov 29 06:09:18 compute-2 sudo[35341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:18 compute-2 python3.9[35343]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:09:19 compute-2 sudo[35341]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:20 compute-2 sudo[35622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uujxwjutprhtwxygnhrqoaqtqljxsbyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396560.0041065-485-240580094422596/AnsiballZ_selinux.py'
Nov 29 06:09:20 compute-2 sudo[35622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:21 compute-2 python3.9[35624]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 06:09:21 compute-2 sudo[35622]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:22 compute-2 sudo[35774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbuqwzyjliebbxtrefvlhjkuomlzbmkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396562.276451-518-65799632530057/AnsiballZ_command.py'
Nov 29 06:09:22 compute-2 sudo[35774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:22 compute-2 python3.9[35776]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 06:09:23 compute-2 sudo[35774]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:25 compute-2 sudo[35927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rofwonkyknwkmqqzlqirbnkertusiefx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396565.1519957-541-26351563702641/AnsiballZ_file.py'
Nov 29 06:09:25 compute-2 sudo[35927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:25 compute-2 python3.9[35929]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:09:25 compute-2 sudo[35927]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:26 compute-2 sudo[36079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dilgmbtiwsmhzfgjlhprczfbkxmdkjuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396566.3655229-566-170498548583643/AnsiballZ_mount.py'
Nov 29 06:09:26 compute-2 sudo[36079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:27 compute-2 python3.9[36081]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 06:09:27 compute-2 sudo[36079]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:29 compute-2 sudo[36231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsckdgxpwusvewrhrnuzqamveruxbwvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396569.4767191-650-190756550263441/AnsiballZ_file.py'
Nov 29 06:09:29 compute-2 sudo[36231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:32 compute-2 python3.9[36233]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:09:32 compute-2 sudo[36231]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:32 compute-2 sudo[36383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmwkmtmuuatsjucxjqcxsbyxmpracaqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396572.4166644-673-168308001233907/AnsiballZ_stat.py'
Nov 29 06:09:32 compute-2 sudo[36383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:35 compute-2 python3.9[36385]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:09:35 compute-2 sudo[36383]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:35 compute-2 sudo[36506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clsopirkcrogyndoqcskdjfgabncxnpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396572.4166644-673-168308001233907/AnsiballZ_copy.py'
Nov 29 06:09:35 compute-2 sudo[36506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:35 compute-2 python3.9[36508]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396572.4166644-673-168308001233907/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:09:35 compute-2 sudo[36506]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:38 compute-2 sudo[36658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beamzssayiszjbgzhtymvdliilgbhxhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396578.3369312-746-247007883305674/AnsiballZ_stat.py'
Nov 29 06:09:38 compute-2 sudo[36658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:38 compute-2 python3.9[36660]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:09:38 compute-2 sudo[36658]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:39 compute-2 sudo[36810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cltjduqpopeobyulfaqcykkhcfbokdjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396579.1415672-770-110781790586245/AnsiballZ_command.py'
Nov 29 06:09:39 compute-2 sudo[36810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:39 compute-2 python3.9[36812]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:09:39 compute-2 sudo[36810]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:40 compute-2 sudo[36963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pacvsjqecodjieekgtdnkdeqzqjfhqjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396580.1055505-794-268898790331028/AnsiballZ_file.py'
Nov 29 06:09:40 compute-2 sudo[36963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:40 compute-2 python3.9[36965]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:09:40 compute-2 sudo[36963]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:41 compute-2 sudo[37115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znsvproebueymswfzmdduecmblrpcjtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396581.234895-826-233369987719659/AnsiballZ_getent.py'
Nov 29 06:09:41 compute-2 sudo[37115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:41 compute-2 python3.9[37117]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 06:09:41 compute-2 sudo[37115]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:41 compute-2 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:09:42 compute-2 sudo[37269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsnjfycwehuovhmugdyjipnuedjgbfpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396582.255769-851-60158904304177/AnsiballZ_group.py'
Nov 29 06:09:42 compute-2 sudo[37269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:43 compute-2 python3.9[37271]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:09:43 compute-2 groupadd[37272]: group added to /etc/group: name=qemu, GID=107
Nov 29 06:09:43 compute-2 groupadd[37272]: group added to /etc/gshadow: name=qemu
Nov 29 06:09:43 compute-2 groupadd[37272]: new group: name=qemu, GID=107
Nov 29 06:09:43 compute-2 sudo[37269]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:44 compute-2 sudo[37427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obcmaziwtxmgotuqegzgsdteyoiccjhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396583.5418756-875-231995581638718/AnsiballZ_user.py'
Nov 29 06:09:44 compute-2 sudo[37427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:44 compute-2 python3.9[37429]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:09:44 compute-2 useradd[37431]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 29 06:09:44 compute-2 sudo[37427]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:45 compute-2 sudo[37587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivhvuvjsomdlmdjccqxsymohifchsvnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396585.0139172-899-206403851018057/AnsiballZ_getent.py'
Nov 29 06:09:45 compute-2 sudo[37587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:45 compute-2 python3.9[37589]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 06:09:45 compute-2 sudo[37587]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:46 compute-2 sudo[37740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvujohwtplxbsnmjemnkjjucfonphwjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396585.8284662-922-230753858528403/AnsiballZ_group.py'
Nov 29 06:09:46 compute-2 sudo[37740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:46 compute-2 python3.9[37742]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:09:46 compute-2 groupadd[37743]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 29 06:09:46 compute-2 groupadd[37743]: group added to /etc/gshadow: name=hugetlbfs
Nov 29 06:09:46 compute-2 groupadd[37743]: new group: name=hugetlbfs, GID=42477
Nov 29 06:09:46 compute-2 sudo[37740]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:47 compute-2 sudo[37898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrdtxlkzaeikanjknqimcfvaxaxvykfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396586.7846177-950-142338321118404/AnsiballZ_file.py'
Nov 29 06:09:47 compute-2 sudo[37898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:47 compute-2 python3.9[37900]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 06:09:47 compute-2 sudo[37898]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:48 compute-2 sudo[38050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlplgqyodgpkyoajnghdoqlcylrmdoeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396587.857137-982-2197483566793/AnsiballZ_dnf.py'
Nov 29 06:09:48 compute-2 sudo[38050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:48 compute-2 python3.9[38052]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:09:50 compute-2 sudo[38050]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:51 compute-2 sudo[38203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebfbvoyvjjrwnwjpslcvnbayjghivkpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396590.6307201-1006-75810267470917/AnsiballZ_file.py'
Nov 29 06:09:51 compute-2 sudo[38203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:51 compute-2 python3.9[38205]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:09:51 compute-2 sudo[38203]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:51 compute-2 sudo[38355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrxvleaopmyqwozdibxaadjzxgicmimv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396591.5275009-1031-149692483657493/AnsiballZ_stat.py'
Nov 29 06:09:51 compute-2 sudo[38355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:52 compute-2 python3.9[38357]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:09:52 compute-2 sudo[38355]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:52 compute-2 sudo[38478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vonbpkqqqkmhzevnvcxixotfyitpbrzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396591.5275009-1031-149692483657493/AnsiballZ_copy.py'
Nov 29 06:09:52 compute-2 sudo[38478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:52 compute-2 python3.9[38480]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396591.5275009-1031-149692483657493/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:09:52 compute-2 sudo[38478]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:53 compute-2 sudo[38630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkeesvhcgigewwhvxkbrgahpgahnjvor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396593.0627453-1076-148311927528114/AnsiballZ_systemd.py'
Nov 29 06:09:53 compute-2 sudo[38630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:53 compute-2 python3.9[38632]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:09:53 compute-2 systemd[1]: Starting Load Kernel Modules...
Nov 29 06:09:53 compute-2 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 29 06:09:53 compute-2 kernel: Bridge firewalling registered
Nov 29 06:09:53 compute-2 systemd-modules-load[38636]: Inserted module 'br_netfilter'
Nov 29 06:09:53 compute-2 systemd[1]: Finished Load Kernel Modules.
Nov 29 06:09:54 compute-2 sudo[38630]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:54 compute-2 sudo[38790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuevrvvygayoidwyvwarrmakglwpmgua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396594.6192741-1100-201496844374221/AnsiballZ_stat.py'
Nov 29 06:09:54 compute-2 sudo[38790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:55 compute-2 python3.9[38792]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:09:55 compute-2 sudo[38790]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:55 compute-2 sudo[38913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnzazcthcgidsvlldvldidpablankqbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396594.6192741-1100-201496844374221/AnsiballZ_copy.py'
Nov 29 06:09:55 compute-2 sudo[38913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:55 compute-2 python3.9[38915]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396594.6192741-1100-201496844374221/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:09:55 compute-2 sudo[38913]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:56 compute-2 sudo[39065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hozwggrgvvsxhmuveifwlkiwdkzugyxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396596.3762555-1154-276603485045289/AnsiballZ_dnf.py'
Nov 29 06:09:56 compute-2 sudo[39065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:56 compute-2 python3.9[39067]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:10:00 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 06:10:00 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 06:10:01 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:10:01 compute-2 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:10:01 compute-2 systemd[1]: Reloading.
Nov 29 06:10:01 compute-2 systemd-rc-local-generator[39129]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:10:01 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:10:02 compute-2 sudo[39065]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:04 compute-2 python3.9[42269]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:10:05 compute-2 python3.9[42956]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 06:10:05 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:10:05 compute-2 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:10:05 compute-2 systemd[1]: man-db-cache-update.service: Consumed 5.617s CPU time.
Nov 29 06:10:05 compute-2 systemd[1]: run-r02a5de2f542640ad8d211521bd77735b.service: Deactivated successfully.
Nov 29 06:10:06 compute-2 python3.9[43107]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:10:07 compute-2 sudo[43257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwqcgszfdryghoforoznfckbpamysdfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396606.8756595-1271-250935695370287/AnsiballZ_command.py'
Nov 29 06:10:07 compute-2 sudo[43257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:07 compute-2 python3.9[43259]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:10:07 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 06:10:08 compute-2 systemd[1]: Starting Authorization Manager...
Nov 29 06:10:08 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 06:10:08 compute-2 polkitd[43476]: Started polkitd version 0.117
Nov 29 06:10:08 compute-2 polkitd[43476]: Loading rules from directory /etc/polkit-1/rules.d
Nov 29 06:10:08 compute-2 polkitd[43476]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 29 06:10:08 compute-2 polkitd[43476]: Finished loading, compiling and executing 2 rules
Nov 29 06:10:08 compute-2 systemd[1]: Started Authorization Manager.
Nov 29 06:10:08 compute-2 polkitd[43476]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 29 06:10:08 compute-2 sudo[43257]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:08 compute-2 sudo[43644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgkzpbxgvsmlogncgjguolnsliyabcvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396608.6242714-1298-272312609217948/AnsiballZ_systemd.py'
Nov 29 06:10:08 compute-2 sudo[43644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:09 compute-2 python3.9[43646]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:10:09 compute-2 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 06:10:09 compute-2 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 06:10:09 compute-2 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 06:10:09 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 06:10:09 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 06:10:09 compute-2 sudo[43644]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:10 compute-2 python3.9[43807]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 06:10:14 compute-2 sudo[43957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zysvlfmotsjxoafqwroawodgrucezsaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396614.0361643-1468-118358275822637/AnsiballZ_systemd.py'
Nov 29 06:10:14 compute-2 sudo[43957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:14 compute-2 python3.9[43959]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:10:14 compute-2 systemd[1]: Reloading.
Nov 29 06:10:14 compute-2 systemd-rc-local-generator[43988]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:10:14 compute-2 sudo[43957]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:15 compute-2 sudo[44145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppbsxltjzcpmlnmwewhdpzlpaojbkurn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396615.1391451-1468-278664098316094/AnsiballZ_systemd.py'
Nov 29 06:10:15 compute-2 sudo[44145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:15 compute-2 python3.9[44147]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:10:15 compute-2 systemd[1]: Reloading.
Nov 29 06:10:15 compute-2 systemd-rc-local-generator[44176]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:10:16 compute-2 sudo[44145]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:16 compute-2 sudo[44333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnkymkpgzdapxgngunakpipzlppdgzef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396616.514355-1516-96083362339149/AnsiballZ_command.py'
Nov 29 06:10:16 compute-2 sudo[44333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:17 compute-2 python3.9[44335]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:10:17 compute-2 sudo[44333]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:17 compute-2 sudo[44486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjigynsemcietftsqkeqsqbgnquyaucv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396617.403402-1541-83635219510254/AnsiballZ_command.py'
Nov 29 06:10:17 compute-2 sudo[44486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:17 compute-2 python3.9[44488]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:10:17 compute-2 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 29 06:10:17 compute-2 sudo[44486]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:18 compute-2 sudo[44639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbslvdgjnhocrismkyutmqupyzcxxyxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396618.2455556-1565-125850718936496/AnsiballZ_command.py'
Nov 29 06:10:18 compute-2 sudo[44639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:18 compute-2 python3.9[44641]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:10:20 compute-2 sudo[44639]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:20 compute-2 sudo[44801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiipimyqxukpgriyygcblqnwojhpzrau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396620.558247-1589-239710933001841/AnsiballZ_command.py'
Nov 29 06:10:20 compute-2 sudo[44801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:21 compute-2 python3.9[44803]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:10:21 compute-2 sudo[44801]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:21 compute-2 sudo[44954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxyejlkfhcmfbgrxvpbqibspnvgcfcpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396621.5454462-1613-79347428545818/AnsiballZ_systemd.py'
Nov 29 06:10:21 compute-2 sudo[44954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:22 compute-2 python3.9[44956]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:10:22 compute-2 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 06:10:22 compute-2 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 06:10:22 compute-2 systemd[1]: Stopping Apply Kernel Variables...
Nov 29 06:10:22 compute-2 systemd[1]: Starting Apply Kernel Variables...
Nov 29 06:10:22 compute-2 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 06:10:22 compute-2 systemd[1]: Finished Apply Kernel Variables.
Nov 29 06:10:22 compute-2 sudo[44954]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:22 compute-2 sshd-session[31282]: Connection closed by 192.168.122.30 port 44050
Nov 29 06:10:22 compute-2 sshd-session[31279]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:10:22 compute-2 systemd[1]: session-10.scope: Deactivated successfully.
Nov 29 06:10:22 compute-2 systemd[1]: session-10.scope: Consumed 2min 17.185s CPU time.
Nov 29 06:10:22 compute-2 systemd-logind[784]: Session 10 logged out. Waiting for processes to exit.
Nov 29 06:10:22 compute-2 systemd-logind[784]: Removed session 10.
Nov 29 06:10:28 compute-2 sshd-session[44986]: Accepted publickey for zuul from 192.168.122.30 port 46182 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:10:28 compute-2 systemd-logind[784]: New session 11 of user zuul.
Nov 29 06:10:29 compute-2 systemd[1]: Started Session 11 of User zuul.
Nov 29 06:10:29 compute-2 sshd-session[44986]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:10:30 compute-2 python3.9[45139]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:10:31 compute-2 sudo[45293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgpcgchfbmqppijkdkqjzjemdsgbjqsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396631.2802088-75-212412168812519/AnsiballZ_getent.py'
Nov 29 06:10:31 compute-2 sudo[45293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:31 compute-2 python3.9[45295]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 06:10:31 compute-2 sudo[45293]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:32 compute-2 sudo[45446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqacubfleiyqqnnspytrhsalxvbfcexs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396632.2213879-99-126850648778529/AnsiballZ_group.py'
Nov 29 06:10:32 compute-2 sudo[45446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:32 compute-2 python3.9[45448]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:10:32 compute-2 groupadd[45449]: group added to /etc/group: name=openvswitch, GID=42476
Nov 29 06:10:32 compute-2 groupadd[45449]: group added to /etc/gshadow: name=openvswitch
Nov 29 06:10:32 compute-2 groupadd[45449]: new group: name=openvswitch, GID=42476
Nov 29 06:10:32 compute-2 sudo[45446]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:34 compute-2 sudo[45604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctnyecrzdcgweuorjcxbjdedrcgihyfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396633.5974655-123-205101332999198/AnsiballZ_user.py'
Nov 29 06:10:34 compute-2 sudo[45604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:34 compute-2 python3.9[45606]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:10:34 compute-2 useradd[45608]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 29 06:10:34 compute-2 useradd[45608]: add 'openvswitch' to group 'hugetlbfs'
Nov 29 06:10:34 compute-2 useradd[45608]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 29 06:10:34 compute-2 sudo[45604]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:35 compute-2 sudo[45764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsecofvcprxyayrwpxolgyjnadrxrsru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396635.1101544-153-160924611111469/AnsiballZ_setup.py'
Nov 29 06:10:35 compute-2 sudo[45764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:35 compute-2 python3.9[45766]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:10:35 compute-2 sudo[45764]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:36 compute-2 sudo[45848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtlhafpnlguanpdjfqhmhknknfdhakpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396635.1101544-153-160924611111469/AnsiballZ_dnf.py'
Nov 29 06:10:36 compute-2 sudo[45848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:36 compute-2 python3.9[45850]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:10:40 compute-2 sudo[45848]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:40 compute-2 sudo[46013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzryevmunfybukcasfletrmgkldohhmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396640.3477428-195-84060983970714/AnsiballZ_dnf.py'
Nov 29 06:10:40 compute-2 sudo[46013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:40 compute-2 python3.9[46015]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:10:51 compute-2 kernel: SELinux:  Converting 2730 SID table entries...
Nov 29 06:10:51 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:10:51 compute-2 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:10:51 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:10:51 compute-2 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:10:51 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:10:51 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:10:51 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:10:51 compute-2 groupadd[46038]: group added to /etc/group: name=unbound, GID=993
Nov 29 06:10:51 compute-2 groupadd[46038]: group added to /etc/gshadow: name=unbound
Nov 29 06:10:51 compute-2 groupadd[46038]: new group: name=unbound, GID=993
Nov 29 06:10:51 compute-2 useradd[46045]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 29 06:10:52 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 29 06:10:52 compute-2 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 29 06:10:53 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:10:53 compute-2 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:10:53 compute-2 systemd[1]: Reloading.
Nov 29 06:10:53 compute-2 systemd-sysv-generator[46546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:10:53 compute-2 systemd-rc-local-generator[46543]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:10:53 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:10:54 compute-2 sudo[46013]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:54 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:10:54 compute-2 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:10:54 compute-2 systemd[1]: run-ra3d75904055f4de0b1e2a4218b562bf6.service: Deactivated successfully.
Nov 29 06:10:58 compute-2 sudo[47110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkvhkrbmutuxjyllxndosfyqqcekoyne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396658.3055563-220-207907232108112/AnsiballZ_systemd.py'
Nov 29 06:10:58 compute-2 sudo[47110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:59 compute-2 python3.9[47112]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:10:59 compute-2 systemd[1]: Reloading.
Nov 29 06:10:59 compute-2 systemd-sysv-generator[47147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:10:59 compute-2 systemd-rc-local-generator[47144]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:10:59 compute-2 systemd[1]: Starting Open vSwitch Database Unit...
Nov 29 06:10:59 compute-2 chown[47155]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 29 06:10:59 compute-2 ovs-ctl[47160]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 29 06:10:59 compute-2 ovs-ctl[47160]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 29 06:10:59 compute-2 ovs-ctl[47160]: Starting ovsdb-server [  OK  ]
Nov 29 06:10:59 compute-2 ovs-vsctl[47209]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 29 06:10:59 compute-2 ovs-vsctl[47229]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"fa6f2e5a-176a-4b37-8b2a-5aaf74119c47\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 29 06:10:59 compute-2 ovs-ctl[47160]: Configuring Open vSwitch system IDs [  OK  ]
Nov 29 06:10:59 compute-2 ovs-ctl[47160]: Enabling remote OVSDB managers [  OK  ]
Nov 29 06:10:59 compute-2 systemd[1]: Started Open vSwitch Database Unit.
Nov 29 06:10:59 compute-2 ovs-vsctl[47235]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Nov 29 06:11:00 compute-2 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 29 06:11:00 compute-2 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 29 06:11:00 compute-2 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 29 06:11:00 compute-2 kernel: openvswitch: Open vSwitch switching datapath
Nov 29 06:11:00 compute-2 ovs-ctl[47279]: Inserting openvswitch module [  OK  ]
Nov 29 06:11:00 compute-2 ovs-ctl[47248]: Starting ovs-vswitchd [  OK  ]
Nov 29 06:11:00 compute-2 ovs-vsctl[47296]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Nov 29 06:11:00 compute-2 ovs-ctl[47248]: Enabling remote OVSDB managers [  OK  ]
Nov 29 06:11:00 compute-2 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 29 06:11:00 compute-2 systemd[1]: Starting Open vSwitch...
Nov 29 06:11:00 compute-2 systemd[1]: Finished Open vSwitch.
Nov 29 06:11:00 compute-2 sudo[47110]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:01 compute-2 python3.9[47448]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:11:02 compute-2 sudo[47598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmnilxtlenfmumvjpmzynevpxwljfmvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396661.8361797-273-202827663650808/AnsiballZ_sefcontext.py'
Nov 29 06:11:02 compute-2 sudo[47598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:02 compute-2 python3.9[47600]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 06:11:03 compute-2 kernel: SELinux:  Converting 2744 SID table entries...
Nov 29 06:11:03 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:11:03 compute-2 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:11:03 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:11:03 compute-2 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:11:03 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:11:03 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:11:03 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:11:03 compute-2 sudo[47598]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:04 compute-2 python3.9[47755]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:11:05 compute-2 sudo[47911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulxqjrbsdyjtxtpxtvdouiwjumjqhudb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396665.6180887-327-13228713388995/AnsiballZ_dnf.py'
Nov 29 06:11:05 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 29 06:11:05 compute-2 sudo[47911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:06 compute-2 python3.9[47913]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:11:07 compute-2 sudo[47911]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:08 compute-2 sudo[48064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woyjaptthyqlevurxweknbgneanzcfls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396667.8397303-351-178523115240015/AnsiballZ_command.py'
Nov 29 06:11:08 compute-2 sudo[48064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:08 compute-2 python3.9[48066]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:11:09 compute-2 sudo[48064]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:10 compute-2 sudo[48351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muxoeoslppwpftxxphbkskuoxiqxinjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396669.6115527-375-277890900233036/AnsiballZ_file.py'
Nov 29 06:11:10 compute-2 sudo[48351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:10 compute-2 python3.9[48353]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:11:10 compute-2 sudo[48351]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:11 compute-2 python3.9[48503]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:11:11 compute-2 sudo[48655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcibbltydxwimxcmhdtedzjnhxenjwdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396671.5680985-423-130005900140185/AnsiballZ_dnf.py'
Nov 29 06:11:11 compute-2 sudo[48655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:12 compute-2 python3.9[48657]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:11:14 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:11:14 compute-2 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:11:14 compute-2 systemd[1]: Reloading.
Nov 29 06:11:14 compute-2 systemd-rc-local-generator[48695]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:11:14 compute-2 systemd-sysv-generator[48698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:11:14 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:11:14 compute-2 sudo[48655]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:14 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:11:14 compute-2 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:11:14 compute-2 systemd[1]: run-r7a764ade846c4b7b8f2e22f306461885.service: Deactivated successfully.
Nov 29 06:11:15 compute-2 sudo[48972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlibztxonarqwsqytebuvemygksfdczy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396675.0745862-447-97919772438089/AnsiballZ_systemd.py'
Nov 29 06:11:15 compute-2 sudo[48972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:15 compute-2 python3.9[48974]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:11:16 compute-2 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 06:11:16 compute-2 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 06:11:16 compute-2 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 06:11:16 compute-2 systemd[1]: Stopping Network Manager...
Nov 29 06:11:16 compute-2 NetworkManager[7196]: <info>  [1764396676.7333] caught SIGTERM, shutting down normally.
Nov 29 06:11:16 compute-2 NetworkManager[7196]: <info>  [1764396676.7359] dhcp4 (eth0): canceled DHCP transaction
Nov 29 06:11:16 compute-2 NetworkManager[7196]: <info>  [1764396676.7361] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 06:11:16 compute-2 NetworkManager[7196]: <info>  [1764396676.7362] dhcp4 (eth0): state changed no lease
Nov 29 06:11:16 compute-2 NetworkManager[7196]: <info>  [1764396676.7366] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 06:11:16 compute-2 NetworkManager[7196]: <info>  [1764396676.7571] exiting (success)
Nov 29 06:11:16 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 06:11:16 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 06:11:16 compute-2 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 06:11:16 compute-2 systemd[1]: Stopped Network Manager.
Nov 29 06:11:16 compute-2 systemd[1]: NetworkManager.service: Consumed 12.300s CPU time, 4.1M memory peak, read 0B from disk, written 38.0K to disk.
Nov 29 06:11:16 compute-2 systemd[1]: Starting Network Manager...
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.8199] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:631d2949-c1d4-4f67-afc4-db082a3ff43a)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.8200] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.8252] manager[0x55a932344090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 06:11:16 compute-2 systemd[1]: Starting Hostname Service...
Nov 29 06:11:16 compute-2 systemd[1]: Started Hostname Service.
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9340] hostname: hostname: using hostnamed
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9342] hostname: static hostname changed from (none) to "compute-2"
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9346] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9350] manager[0x55a932344090]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9350] manager[0x55a932344090]: rfkill: WWAN hardware radio set enabled
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9370] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9377] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9378] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9378] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9379] manager: Networking is enabled by state file
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9380] settings: Loaded settings plugin: keyfile (internal)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9383] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9401] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9407] dhcp: init: Using DHCP client 'internal'
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9410] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9413] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9417] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9422] device (lo): Activation: starting connection 'lo' (5b6e73d6-4c36-495c-9d49-56d866cbd8e2)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9426] device (eth0): carrier: link connected
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9429] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9432] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9433] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9436] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9441] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9444] device (eth1): carrier: link connected
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9447] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9450] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (00e95469-28f7-5d90-a077-7f69916381bc) (indicated)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9451] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9454] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9459] device (eth1): Activation: starting connection 'ci-private-network' (00e95469-28f7-5d90-a077-7f69916381bc)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9464] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9469] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9470] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9471] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-2 systemd[1]: Started Network Manager.
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9473] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9474] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9476] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9477] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9479] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9484] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9486] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9507] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9530] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9545] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9550] dhcp4 (eth0): state changed new lease, address=38.102.83.94
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9554] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9564] device (lo): Activation: successful, device activated.
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9581] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 06:11:16 compute-2 systemd[1]: Starting Network Manager Wait Online...
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9662] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9674] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9677] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9684] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9689] device (eth1): Activation: successful, device activated.
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9704] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9707] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9713] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9719] device (eth0): Activation: successful, device activated.
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9726] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 06:11:16 compute-2 NetworkManager[48989]: <info>  [1764396676.9732] manager: startup complete
Nov 29 06:11:16 compute-2 sudo[48972]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:16 compute-2 systemd[1]: Finished Network Manager Wait Online.
Nov 29 06:11:17 compute-2 sshd-session[49067]: Invalid user student from 92.118.39.92 port 41468
Nov 29 06:11:17 compute-2 sudo[49201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujfonuszjnseqbyvjwxjuvejcqksdcpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396677.2478104-471-278222380115248/AnsiballZ_dnf.py'
Nov 29 06:11:17 compute-2 sudo[49201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:17 compute-2 sshd-session[49067]: Connection closed by invalid user student 92.118.39.92 port 41468 [preauth]
Nov 29 06:11:17 compute-2 python3.9[49203]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:11:26 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:11:26 compute-2 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:11:26 compute-2 systemd[1]: Reloading.
Nov 29 06:11:26 compute-2 systemd-sysv-generator[49261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:11:26 compute-2 systemd-rc-local-generator[49257]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:11:26 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:11:27 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 06:11:29 compute-2 sudo[49201]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:29 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:11:29 compute-2 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:11:29 compute-2 systemd[1]: run-rcc5793aeba7e41e3bbf790d3a6c3aec5.service: Deactivated successfully.
Nov 29 06:11:29 compute-2 sudo[49660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccyyfrcvhlohapnzsniohhedfduixaqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396689.590552-507-48109411401427/AnsiballZ_stat.py'
Nov 29 06:11:29 compute-2 sudo[49660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:30 compute-2 python3.9[49662]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:11:30 compute-2 sudo[49660]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:30 compute-2 sudo[49812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veblscciricgubuzrlmxnansjjaqujoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396690.5005782-534-258958449570885/AnsiballZ_ini_file.py'
Nov 29 06:11:30 compute-2 sudo[49812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:31 compute-2 python3.9[49814]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:31 compute-2 sudo[49812]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:31 compute-2 sudo[49966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dumxbqdcmebbwudvrbqswrpamcsslykl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396691.5521038-564-205240776531135/AnsiballZ_ini_file.py'
Nov 29 06:11:31 compute-2 sudo[49966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:32 compute-2 python3.9[49968]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:32 compute-2 sudo[49966]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:32 compute-2 sudo[50118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhsburfsykzqrotstauwfoydbqtwhztg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396692.283123-564-262290120278666/AnsiballZ_ini_file.py'
Nov 29 06:11:32 compute-2 sudo[50118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:32 compute-2 python3.9[50120]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:32 compute-2 sudo[50118]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:33 compute-2 sudo[50270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipdupwlgpxrexittboprntdcxmqdsswj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396693.307352-609-101185401413985/AnsiballZ_ini_file.py'
Nov 29 06:11:33 compute-2 sudo[50270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:33 compute-2 python3.9[50272]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:33 compute-2 sudo[50270]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:34 compute-2 sudo[50422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyhwvvckbhznmgvehrcqywwmpabbkujg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396693.984343-609-257648154778396/AnsiballZ_ini_file.py'
Nov 29 06:11:34 compute-2 sudo[50422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:34 compute-2 python3.9[50424]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:34 compute-2 sudo[50422]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:35 compute-2 sudo[50574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeccdohpvsetwkkhyyzafoxcckmjlhpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396694.7573783-654-138020752444616/AnsiballZ_stat.py'
Nov 29 06:11:35 compute-2 sudo[50574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:35 compute-2 python3.9[50576]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:11:35 compute-2 sudo[50574]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:35 compute-2 sudo[50697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seetftddssmuafcsrcjipzxyhivvazqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396694.7573783-654-138020752444616/AnsiballZ_copy.py'
Nov 29 06:11:35 compute-2 sudo[50697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:36 compute-2 python3.9[50699]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396694.7573783-654-138020752444616/.source _original_basename=.6f2xn2e9 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:36 compute-2 sudo[50697]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:36 compute-2 sudo[50849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onhdeafsvzoplhbtecwmocnnpecjftvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396696.2495778-699-274434056082840/AnsiballZ_file.py'
Nov 29 06:11:36 compute-2 sudo[50849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:36 compute-2 python3.9[50851]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:36 compute-2 sudo[50849]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:37 compute-2 sudo[51001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpjbdbohlnpjgzbcqlaulsixpqvsekhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396697.2614937-723-180037797801786/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 29 06:11:37 compute-2 sudo[51001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:37 compute-2 python3.9[51003]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 29 06:11:37 compute-2 sudo[51001]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:38 compute-2 sudo[51153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhxxtxdelmftrkhpfblsgwxpklfuhnsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396698.2773867-751-37190036948112/AnsiballZ_file.py'
Nov 29 06:11:38 compute-2 sudo[51153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:38 compute-2 python3.9[51155]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:38 compute-2 sudo[51153]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:39 compute-2 sudo[51305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrpeqpcsimtzoqrcqvcqgxquxjwhskod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396699.1902373-780-139081345780142/AnsiballZ_stat.py'
Nov 29 06:11:39 compute-2 sudo[51305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:39 compute-2 sudo[51305]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:40 compute-2 sudo[51428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhkzvakwwpwulgbwjzhisdwlsjbffyri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396699.1902373-780-139081345780142/AnsiballZ_copy.py'
Nov 29 06:11:40 compute-2 sudo[51428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:40 compute-2 sudo[51428]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:41 compute-2 sudo[51580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lipyslhkgvhmqbcgtjkxkpiqdxusqegg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396700.6050682-825-232820498981490/AnsiballZ_slurp.py'
Nov 29 06:11:41 compute-2 sudo[51580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:41 compute-2 python3.9[51582]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 29 06:11:41 compute-2 sudo[51580]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:42 compute-2 sudo[51755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocnigpbzsfwurieizdejhwcpazvhrhrz ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396701.787739-852-56175461338808/async_wrapper.py j690250472157 300 /home/zuul/.ansible/tmp/ansible-tmp-1764396701.787739-852-56175461338808/AnsiballZ_edpm_os_net_config.py _'
Nov 29 06:11:42 compute-2 sudo[51755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:42 compute-2 ansible-async_wrapper.py[51757]: Invoked with j690250472157 300 /home/zuul/.ansible/tmp/ansible-tmp-1764396701.787739-852-56175461338808/AnsiballZ_edpm_os_net_config.py _
Nov 29 06:11:42 compute-2 ansible-async_wrapper.py[51760]: Starting module and watcher
Nov 29 06:11:42 compute-2 ansible-async_wrapper.py[51760]: Start watching 51761 (300)
Nov 29 06:11:42 compute-2 ansible-async_wrapper.py[51761]: Start module (51761)
Nov 29 06:11:42 compute-2 ansible-async_wrapper.py[51757]: Return async_wrapper task started.
Nov 29 06:11:42 compute-2 sudo[51755]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:42 compute-2 python3.9[51762]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 29 06:11:43 compute-2 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 29 06:11:43 compute-2 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 29 06:11:43 compute-2 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 29 06:11:43 compute-2 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 29 06:11:43 compute-2 kernel: cfg80211: failed to load regulatory.db
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.0638] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.0666] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1482] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1489] audit: op="connection-add" uuid="ebfe54cf-96d3-49e4-b61e-3677a9d0560c" name="br-ex-br" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1501] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1503] audit: op="connection-add" uuid="017be005-bfda-476b-a9ba-d18ac711f909" name="br-ex-port" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1515] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1518] audit: op="connection-add" uuid="1085709c-fa82-42c1-b9a3-9116d2c9c85c" name="eth1-port" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1530] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1532] audit: op="connection-add" uuid="56a3b9e0-a8cf-4243-8dbe-779deca6da4e" name="vlan20-port" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1544] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1546] audit: op="connection-add" uuid="f84d3f1b-fa78-4ec7-a5e6-a9c478291891" name="vlan21-port" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1557] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1560] audit: op="connection-add" uuid="34370b0b-4ab8-4813-96eb-6607120b8615" name="vlan22-port" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1572] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1574] audit: op="connection-add" uuid="8bffdbc6-597a-42b5-85f1-65525bb0f7fd" name="vlan23-port" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1593] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1609] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1612] audit: op="connection-add" uuid="f1807500-9cc7-403f-ba10-db9bacdae9f1" name="br-ex-if" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1661] audit: op="connection-update" uuid="00e95469-28f7-5d90-a077-7f69916381bc" name="ci-private-network" args="ovs-interface.type,ipv4.dns,ipv4.never-default,ipv4.addresses,ipv4.method,ipv4.routes,ipv4.routing-rules,connection.controller,connection.port-type,connection.slave-type,connection.timestamp,connection.master,ipv6.dns,ipv6.addresses,ipv6.method,ipv6.routes,ipv6.addr-gen-mode,ipv6.routing-rules,ovs-external-ids.data" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1676] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1679] audit: op="connection-add" uuid="2453c22d-6f61-42e5-85d9-d536640dda9d" name="vlan20-if" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1693] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1696] audit: op="connection-add" uuid="2a7b0e3f-5d6a-4918-ac4e-9d3dc2d30b6e" name="vlan21-if" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1712] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1715] audit: op="connection-add" uuid="63311777-9903-4357-845b-90dd8bfaf872" name="vlan22-if" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1731] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1733] audit: op="connection-add" uuid="4526e221-80ce-4766-b424-c6e5fc20bae4" name="vlan23-if" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1746] audit: op="connection-delete" uuid="30147632-9597-375e-a51b-e6c74b52332e" name="Wired connection 1" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1758] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1771] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1777] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (ebfe54cf-96d3-49e4-b61e-3677a9d0560c)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1779] audit: op="connection-activate" uuid="ebfe54cf-96d3-49e4-b61e-3677a9d0560c" name="br-ex-br" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1782] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1792] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1797] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (017be005-bfda-476b-a9ba-d18ac711f909)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1800] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1809] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1815] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (1085709c-fa82-42c1-b9a3-9116d2c9c85c)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1817] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1827] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1832] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (56a3b9e0-a8cf-4243-8dbe-779deca6da4e)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1835] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1844] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1850] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (f84d3f1b-fa78-4ec7-a5e6-a9c478291891)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1853] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1862] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1868] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (34370b0b-4ab8-4813-96eb-6607120b8615)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1870] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1880] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1886] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (8bffdbc6-597a-42b5-85f1-65525bb0f7fd)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1888] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1892] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1895] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1902] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1909] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1915] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (f1807500-9cc7-403f-ba10-db9bacdae9f1)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1917] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1922] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1925] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1926] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1929] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1943] device (eth1): disconnecting for new activation request.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1944] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1947] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1949] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1950] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1952] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1956] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1960] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (2453c22d-6f61-42e5-85d9-d536640dda9d)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1961] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1963] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1965] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1966] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1969] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1973] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1977] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (2a7b0e3f-5d6a-4918-ac4e-9d3dc2d30b6e)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1978] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1980] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1982] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1983] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1986] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1990] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1994] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (63311777-9903-4357-845b-90dd8bfaf872)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1995] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1997] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.1999] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2000] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2003] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2007] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2011] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (4526e221-80ce-4766-b424-c6e5fc20bae4)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2012] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2015] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2016] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2017] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2019] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2030] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2032] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2035] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2036] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2043] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2045] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2049] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2051] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2052] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2056] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2059] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2061] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2062] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2065] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2068] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2070] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2072] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2075] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2078] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2080] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2082] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2085] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2088] dhcp4 (eth0): canceled DHCP transaction
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2088] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2089] dhcp4 (eth0): state changed no lease
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2090] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2100] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51763 uid=0 result="fail" reason="Device is not activated"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2137] dhcp4 (eth0): state changed new lease, address=38.102.83.94
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2227] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2236] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 29 06:11:45 compute-2 kernel: ovs-system: entered promiscuous mode
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2267] device (eth1): disconnecting for new activation request.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2268] audit: op="connection-activate" uuid="00e95469-28f7-5d90-a077-7f69916381bc" name="ci-private-network" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2271] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2281] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 29 06:11:45 compute-2 kernel: Timeout policy base is empty
Nov 29 06:11:45 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2291] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Nov 29 06:11:45 compute-2 systemd-udevd[51768]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2335] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51763 uid=0 result="success"
Nov 29 06:11:45 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2415] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2510] device (eth1): Activation: starting connection 'ci-private-network' (00e95469-28f7-5d90-a077-7f69916381bc)
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2515] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2525] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2531] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2538] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2543] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2548] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2550] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2552] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2553] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2555] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2557] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2561] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2568] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2573] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2576] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2582] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2586] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2591] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2595] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2600] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2604] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2609] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2613] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2617] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2623] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2627] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2674] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2676] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2683] device (eth1): Activation: successful, device activated.
Nov 29 06:11:45 compute-2 kernel: br-ex: entered promiscuous mode
Nov 29 06:11:45 compute-2 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 29 06:11:45 compute-2 kernel: vlan22: entered promiscuous mode
Nov 29 06:11:45 compute-2 systemd-udevd[51769]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2875] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2887] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2914] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2916] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.2924] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 kernel: vlan21: entered promiscuous mode
Nov 29 06:11:45 compute-2 kernel: vlan23: entered promiscuous mode
Nov 29 06:11:45 compute-2 systemd-udevd[51767]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3052] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3064] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3078] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3091] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3095] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3107] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 kernel: vlan20: entered promiscuous mode
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3115] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3162] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3166] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3174] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3217] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3234] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3247] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3261] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3270] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3275] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3284] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3320] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3322] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-2 NetworkManager[48989]: <info>  [1764396705.3330] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:11:46 compute-2 sudo[52119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djwotjemgzebpfehaeaiuqroblzgwzbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396705.9678183-852-91126200795219/AnsiballZ_async_status.py'
Nov 29 06:11:46 compute-2 sudo[52119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:46 compute-2 NetworkManager[48989]: <info>  [1764396706.4565] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51763 uid=0 result="success"
Nov 29 06:11:46 compute-2 python3.9[52121]: ansible-ansible.legacy.async_status Invoked with jid=j690250472157.51757 mode=status _async_dir=/root/.ansible_async
Nov 29 06:11:46 compute-2 sudo[52119]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:46 compute-2 NetworkManager[48989]: <info>  [1764396706.6757] checkpoint[0x55a93231a950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 29 06:11:46 compute-2 NetworkManager[48989]: <info>  [1764396706.6759] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51763 uid=0 result="success"
Nov 29 06:11:46 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 06:11:47 compute-2 NetworkManager[48989]: <info>  [1764396707.0092] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51763 uid=0 result="success"
Nov 29 06:11:47 compute-2 NetworkManager[48989]: <info>  [1764396707.0109] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51763 uid=0 result="success"
Nov 29 06:11:47 compute-2 NetworkManager[48989]: <info>  [1764396707.2570] audit: op="networking-control" arg="global-dns-configuration" pid=51763 uid=0 result="success"
Nov 29 06:11:47 compute-2 NetworkManager[48989]: <info>  [1764396707.2599] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 29 06:11:47 compute-2 NetworkManager[48989]: <info>  [1764396707.2637] audit: op="networking-control" arg="global-dns-configuration" pid=51763 uid=0 result="success"
Nov 29 06:11:47 compute-2 NetworkManager[48989]: <info>  [1764396707.2665] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51763 uid=0 result="success"
Nov 29 06:11:47 compute-2 NetworkManager[48989]: <info>  [1764396707.4824] checkpoint[0x55a93231aa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 29 06:11:47 compute-2 NetworkManager[48989]: <info>  [1764396707.4829] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51763 uid=0 result="success"
Nov 29 06:11:47 compute-2 ansible-async_wrapper.py[51761]: Module complete (51761)
Nov 29 06:11:47 compute-2 ansible-async_wrapper.py[51760]: Done in kid B.
Nov 29 06:11:49 compute-2 sudo[52227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfyhafxjaphmofmxrkynuixejalucqxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396705.9678183-852-91126200795219/AnsiballZ_async_status.py'
Nov 29 06:11:49 compute-2 sudo[52227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:50 compute-2 python3.9[52229]: ansible-ansible.legacy.async_status Invoked with jid=j690250472157.51757 mode=status _async_dir=/root/.ansible_async
Nov 29 06:11:50 compute-2 sudo[52227]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:50 compute-2 sudo[52327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlpbyjhhnqqbktsrkkdmdiimddliusos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396705.9678183-852-91126200795219/AnsiballZ_async_status.py'
Nov 29 06:11:50 compute-2 sudo[52327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:50 compute-2 python3.9[52329]: ansible-ansible.legacy.async_status Invoked with jid=j690250472157.51757 mode=cleanup _async_dir=/root/.ansible_async
Nov 29 06:11:50 compute-2 sudo[52327]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:51 compute-2 sudo[52479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsuxlgbjhexrmqzfvsvyfgksajxxxljg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396711.0874846-933-126930131588677/AnsiballZ_stat.py'
Nov 29 06:11:51 compute-2 sudo[52479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:51 compute-2 python3.9[52481]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:11:51 compute-2 sudo[52479]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:51 compute-2 sudo[52602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loysfrabpzknuubqmklrczhrlnlvgwwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396711.0874846-933-126930131588677/AnsiballZ_copy.py'
Nov 29 06:11:51 compute-2 sudo[52602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:52 compute-2 python3.9[52604]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396711.0874846-933-126930131588677/.source.returncode _original_basename=.1wwpghqs follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:52 compute-2 sudo[52602]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:52 compute-2 sudo[52754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwlmalmhndtdgzdzvajzfbkrbbqqkuwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396712.5868278-981-16913086037819/AnsiballZ_stat.py'
Nov 29 06:11:52 compute-2 sudo[52754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:53 compute-2 python3.9[52756]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:11:53 compute-2 sudo[52754]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:53 compute-2 sudo[52878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tihrnskudrmxmxktdcxjonsmgyzoqwio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396712.5868278-981-16913086037819/AnsiballZ_copy.py'
Nov 29 06:11:53 compute-2 sudo[52878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:54 compute-2 python3.9[52880]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396712.5868278-981-16913086037819/.source.cfg _original_basename=.7zg22jhc follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:54 compute-2 sudo[52878]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:54 compute-2 sudo[53030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxvaszgyspzgfbelrvcrhzhsywdzxhev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396714.3122745-1026-33164095204360/AnsiballZ_systemd.py'
Nov 29 06:11:54 compute-2 sudo[53030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:54 compute-2 python3.9[53032]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:11:55 compute-2 systemd[1]: Reloading Network Manager...
Nov 29 06:11:55 compute-2 NetworkManager[48989]: <info>  [1764396715.0530] audit: op="reload" arg="0" pid=53036 uid=0 result="success"
Nov 29 06:11:55 compute-2 NetworkManager[48989]: <info>  [1764396715.0537] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 29 06:11:55 compute-2 systemd[1]: Reloaded Network Manager.
Nov 29 06:11:55 compute-2 sudo[53030]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:55 compute-2 sshd-session[44989]: Connection closed by 192.168.122.30 port 46182
Nov 29 06:11:55 compute-2 sshd-session[44986]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:11:55 compute-2 systemd[1]: session-11.scope: Deactivated successfully.
Nov 29 06:11:55 compute-2 systemd[1]: session-11.scope: Consumed 51.308s CPU time.
Nov 29 06:11:55 compute-2 systemd-logind[784]: Session 11 logged out. Waiting for processes to exit.
Nov 29 06:11:55 compute-2 systemd-logind[784]: Removed session 11.
Nov 29 06:12:00 compute-2 sshd-session[53067]: Accepted publickey for zuul from 192.168.122.30 port 41490 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:12:00 compute-2 systemd-logind[784]: New session 12 of user zuul.
Nov 29 06:12:00 compute-2 systemd[1]: Started Session 12 of User zuul.
Nov 29 06:12:00 compute-2 sshd-session[53067]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:12:01 compute-2 python3.9[53220]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:12:02 compute-2 python3.9[53374]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:12:05 compute-2 python3.9[53568]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:12:05 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 06:12:06 compute-2 sshd-session[53070]: Connection closed by 192.168.122.30 port 41490
Nov 29 06:12:06 compute-2 sshd-session[53067]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:12:06 compute-2 systemd[1]: session-12.scope: Deactivated successfully.
Nov 29 06:12:06 compute-2 systemd[1]: session-12.scope: Consumed 2.442s CPU time.
Nov 29 06:12:06 compute-2 systemd-logind[784]: Session 12 logged out. Waiting for processes to exit.
Nov 29 06:12:06 compute-2 systemd-logind[784]: Removed session 12.
Nov 29 06:12:11 compute-2 sshd-session[53597]: Accepted publickey for zuul from 192.168.122.30 port 33194 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:12:11 compute-2 systemd-logind[784]: New session 13 of user zuul.
Nov 29 06:12:11 compute-2 systemd[1]: Started Session 13 of User zuul.
Nov 29 06:12:11 compute-2 sshd-session[53597]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:12:13 compute-2 python3.9[53751]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:12:14 compute-2 python3.9[53905]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:12:15 compute-2 sudo[54059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtimtixsbuzbacjwzzkyueziqhtkbweb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396734.8847334-87-21748398660272/AnsiballZ_setup.py'
Nov 29 06:12:15 compute-2 sudo[54059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:15 compute-2 python3.9[54061]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:12:15 compute-2 sudo[54059]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:16 compute-2 sudo[54144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apunzboamnanasnzfhpiqgqpjiowrrxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396734.8847334-87-21748398660272/AnsiballZ_dnf.py'
Nov 29 06:12:16 compute-2 sudo[54144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:16 compute-2 python3.9[54146]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:12:17 compute-2 sudo[54144]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:18 compute-2 sudo[54297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txegxzchhugwjcskfdexxevibljomczx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396738.194772-123-2889465935579/AnsiballZ_setup.py'
Nov 29 06:12:18 compute-2 sudo[54297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:18 compute-2 python3.9[54299]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:12:19 compute-2 sudo[54297]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:21 compute-2 sudo[54492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxerovzhmgrlkfygeetvtjctwwhxflaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396740.6047225-156-18423116522626/AnsiballZ_file.py'
Nov 29 06:12:21 compute-2 sudo[54492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:21 compute-2 python3.9[54494]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:12:21 compute-2 sudo[54492]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:21 compute-2 sudo[54644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljnnsasayeqqcirzizoovibqdaygjbqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396741.4863136-181-263817681037010/AnsiballZ_command.py'
Nov 29 06:12:21 compute-2 sudo[54644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:22 compute-2 python3.9[54646]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:12:23 compute-2 podman[54647]: 2025-11-29 06:12:23.897961758 +0000 UTC m=+1.735448650 system refresh
Nov 29 06:12:23 compute-2 sudo[54644]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:24 compute-2 sudo[54807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izuvmrmahpuobtblznklfxbuzseleyga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396744.1340225-204-16835673226794/AnsiballZ_stat.py'
Nov 29 06:12:24 compute-2 sudo[54807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:24 compute-2 python3.9[54809]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:12:24 compute-2 sudo[54807]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:24 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:12:25 compute-2 sudo[54930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chbptyftzhuyusqsizjchuspmjymqrat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396744.1340225-204-16835673226794/AnsiballZ_copy.py'
Nov 29 06:12:25 compute-2 sudo[54930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:25 compute-2 python3.9[54932]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396744.1340225-204-16835673226794/.source.json follow=False _original_basename=podman_network_config.j2 checksum=9d8d6fb48c24217b2ff710035753b7137f3c873e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:12:25 compute-2 sudo[54930]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:26 compute-2 sudo[55082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lurcopadvipfcrwnmbirsiqskgwkoftx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396745.769978-250-154268522424042/AnsiballZ_stat.py'
Nov 29 06:12:26 compute-2 sudo[55082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:26 compute-2 python3.9[55084]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:12:26 compute-2 sudo[55082]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:26 compute-2 sudo[55205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzigpijnawsopwisaupmzpzjvqsofkuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396745.769978-250-154268522424042/AnsiballZ_copy.py'
Nov 29 06:12:26 compute-2 sudo[55205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:27 compute-2 python3.9[55207]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396745.769978-250-154268522424042/.source.conf follow=False _original_basename=registries.conf.j2 checksum=25aa6c560e50dcbd81b989ea46a7865cb55b8998 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:12:27 compute-2 sudo[55205]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:27 compute-2 sudo[55357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlkknqajyunbylxwvwwhubbyyjbzgvbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396747.48321-298-80685543313716/AnsiballZ_ini_file.py'
Nov 29 06:12:27 compute-2 sudo[55357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:28 compute-2 python3.9[55359]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:12:28 compute-2 sudo[55357]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:28 compute-2 sudo[55509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxsjeqbzistrstmcfrruvysxocxdhpkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396748.3011725-298-211331987611329/AnsiballZ_ini_file.py'
Nov 29 06:12:28 compute-2 sudo[55509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:28 compute-2 python3.9[55511]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:12:28 compute-2 sudo[55509]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:29 compute-2 sudo[55661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxgmfayhdvjubmsjhmygddrcndisligj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396749.022845-298-241679018442489/AnsiballZ_ini_file.py'
Nov 29 06:12:29 compute-2 sudo[55661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:29 compute-2 python3.9[55663]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:12:29 compute-2 sudo[55661]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:30 compute-2 sudo[55813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mucofulayriocmusjhwbxrdiwkxnqmpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396749.7313747-298-70320519811695/AnsiballZ_ini_file.py'
Nov 29 06:12:30 compute-2 sudo[55813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:30 compute-2 python3.9[55815]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:12:30 compute-2 sudo[55813]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:31 compute-2 sudo[55965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzgdclckdxnchtdwsmjwqvnroqsoabno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396750.857278-390-562639658510/AnsiballZ_dnf.py'
Nov 29 06:12:31 compute-2 sudo[55965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:31 compute-2 python3.9[55967]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:12:32 compute-2 sudo[55965]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:33 compute-2 sudo[56118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhkpsrvnpaucchbjevkydtlhcwciskmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396753.2798247-423-187371563580663/AnsiballZ_setup.py'
Nov 29 06:12:33 compute-2 sudo[56118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:33 compute-2 python3.9[56120]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:12:33 compute-2 sudo[56118]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:34 compute-2 sudo[56272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppqoxrzycpofonkbgaceifbusvprqgab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396754.1951077-447-190494250900959/AnsiballZ_stat.py'
Nov 29 06:12:34 compute-2 sudo[56272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:34 compute-2 python3.9[56274]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:12:34 compute-2 sudo[56272]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:35 compute-2 sudo[56424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqtlgssetljlmetwsjsdtutiqbqrrrcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396755.3521988-474-187052560352235/AnsiballZ_stat.py'
Nov 29 06:12:35 compute-2 sudo[56424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:35 compute-2 python3.9[56426]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:12:35 compute-2 sudo[56424]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:36 compute-2 sudo[56576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agvdcwjisrkxkxizjxdrhqlfdzhxmkgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396756.3345826-505-127565849265780/AnsiballZ_command.py'
Nov 29 06:12:36 compute-2 sudo[56576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:36 compute-2 python3.9[56578]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:12:36 compute-2 sudo[56576]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:37 compute-2 sudo[56729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgtypfjyzclccozduazhsjzwuhisizbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396757.349032-534-202427377276078/AnsiballZ_service_facts.py'
Nov 29 06:12:37 compute-2 sudo[56729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:38 compute-2 python3.9[56731]: ansible-service_facts Invoked
Nov 29 06:12:38 compute-2 network[56748]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:12:38 compute-2 network[56749]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:12:38 compute-2 network[56750]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:12:42 compute-2 sudo[56729]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:43 compute-2 sudo[57033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onlinbkzrnvmftcjknyszrvwfgufticf ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764396763.5262036-580-159141156121198/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764396763.5262036-580-159141156121198/args'
Nov 29 06:12:43 compute-2 sudo[57033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:44 compute-2 sudo[57033]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:44 compute-2 sudo[57200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oudvlgnpclzbiqeosxfzfeluiwiicjoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396764.4015894-612-249860603498117/AnsiballZ_dnf.py'
Nov 29 06:12:44 compute-2 sudo[57200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:44 compute-2 python3.9[57202]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:12:46 compute-2 sudo[57200]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:48 compute-2 sudo[57353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcjebwwbigfneouniwubdfkpjaloxcud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396767.878011-652-280006387479672/AnsiballZ_package_facts.py'
Nov 29 06:12:48 compute-2 sudo[57353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:48 compute-2 python3.9[57355]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 06:12:48 compute-2 sudo[57353]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:50 compute-2 sudo[57505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akzuapuymjbtxtdkwuszscbrgmunuhbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396769.836312-683-93344662891375/AnsiballZ_stat.py'
Nov 29 06:12:50 compute-2 sudo[57505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:50 compute-2 python3.9[57507]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:12:50 compute-2 sudo[57505]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:50 compute-2 sudo[57630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmhefvrsupuhqxyzlsnmzzwtxisxpiaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396769.836312-683-93344662891375/AnsiballZ_copy.py'
Nov 29 06:12:50 compute-2 sudo[57630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:51 compute-2 python3.9[57632]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396769.836312-683-93344662891375/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:12:51 compute-2 sudo[57630]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:51 compute-2 sudo[57784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plksmilknfcqhxxhvvvxjxcbtrnysnkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396771.4855196-728-266827799762230/AnsiballZ_stat.py'
Nov 29 06:12:51 compute-2 sudo[57784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:52 compute-2 python3.9[57786]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:12:52 compute-2 sudo[57784]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:52 compute-2 sudo[57909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msncigqomumdetcuhfgskuhfviydnxtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396771.4855196-728-266827799762230/AnsiballZ_copy.py'
Nov 29 06:12:52 compute-2 sudo[57909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:52 compute-2 python3.9[57911]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396771.4855196-728-266827799762230/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:12:52 compute-2 sudo[57909]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:54 compute-2 sudo[58063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlqjfqbjytcgldbavhdoyoxgguofltyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396773.803531-792-94427409766067/AnsiballZ_lineinfile.py'
Nov 29 06:12:54 compute-2 sudo[58063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:54 compute-2 python3.9[58065]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:12:54 compute-2 sudo[58063]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:56 compute-2 sudo[58217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqzhzbgkyjoofrhzurjuyexjxvdshqda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396775.6595497-836-273622485370752/AnsiballZ_setup.py'
Nov 29 06:12:56 compute-2 sudo[58217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:56 compute-2 python3.9[58219]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:12:56 compute-2 sudo[58217]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:57 compute-2 sudo[58301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxnegpypwnbidjjrolbrtevrjndicytw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396775.6595497-836-273622485370752/AnsiballZ_systemd.py'
Nov 29 06:12:57 compute-2 sudo[58301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:57 compute-2 python3.9[58303]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:12:57 compute-2 sudo[58301]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:58 compute-2 sudo[58455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fswbvzftzxmsuezegvpsblnyigafujse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396778.466896-883-73788001503817/AnsiballZ_setup.py'
Nov 29 06:12:58 compute-2 sudo[58455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:59 compute-2 python3.9[58457]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:12:59 compute-2 sudo[58455]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:59 compute-2 sudo[58539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pddvyegpvktaovkleyllwenpzvekjnja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396778.466896-883-73788001503817/AnsiballZ_systemd.py'
Nov 29 06:12:59 compute-2 sudo[58539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:59 compute-2 python3.9[58541]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:12:59 compute-2 chronyd[787]: chronyd exiting
Nov 29 06:12:59 compute-2 systemd[1]: Stopping NTP client/server...
Nov 29 06:12:59 compute-2 systemd[1]: chronyd.service: Deactivated successfully.
Nov 29 06:12:59 compute-2 systemd[1]: Stopped NTP client/server.
Nov 29 06:12:59 compute-2 systemd[1]: Starting NTP client/server...
Nov 29 06:12:59 compute-2 chronyd[58550]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 06:12:59 compute-2 chronyd[58550]: Frequency -26.017 +/- 0.314 ppm read from /var/lib/chrony/drift
Nov 29 06:12:59 compute-2 chronyd[58550]: Loaded seccomp filter (level 2)
Nov 29 06:12:59 compute-2 systemd[1]: Started NTP client/server.
Nov 29 06:13:00 compute-2 sudo[58539]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:00 compute-2 sshd-session[53600]: Connection closed by 192.168.122.30 port 33194
Nov 29 06:13:00 compute-2 sshd-session[53597]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:13:00 compute-2 systemd[1]: session-13.scope: Deactivated successfully.
Nov 29 06:13:00 compute-2 systemd[1]: session-13.scope: Consumed 28.003s CPU time.
Nov 29 06:13:00 compute-2 systemd-logind[784]: Session 13 logged out. Waiting for processes to exit.
Nov 29 06:13:00 compute-2 systemd-logind[784]: Removed session 13.
Nov 29 06:13:06 compute-2 sshd-session[58576]: Accepted publickey for zuul from 192.168.122.30 port 35916 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:13:06 compute-2 systemd-logind[784]: New session 14 of user zuul.
Nov 29 06:13:06 compute-2 systemd[1]: Started Session 14 of User zuul.
Nov 29 06:13:06 compute-2 sshd-session[58576]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:13:06 compute-2 sudo[58729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejyiyavxcxsefemkgujpiktwfsdebofj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396786.330517-34-46713988931445/AnsiballZ_file.py'
Nov 29 06:13:06 compute-2 sudo[58729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:07 compute-2 python3.9[58731]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:07 compute-2 sudo[58729]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:07 compute-2 sudo[58881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbwcdvbkpyfmtnxsnxzeapjwjnjywgcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396787.308909-71-207119636941577/AnsiballZ_stat.py'
Nov 29 06:13:07 compute-2 sudo[58881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:07 compute-2 python3.9[58883]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:07 compute-2 sudo[58881]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:08 compute-2 sudo[59004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhtfeeldjfkqiptmqppsgsqxlrrjlfiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396787.308909-71-207119636941577/AnsiballZ_copy.py'
Nov 29 06:13:08 compute-2 sudo[59004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:08 compute-2 python3.9[59006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396787.308909-71-207119636941577/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:08 compute-2 sudo[59004]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:09 compute-2 sshd-session[58579]: Connection closed by 192.168.122.30 port 35916
Nov 29 06:13:09 compute-2 sshd-session[58576]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:13:09 compute-2 systemd[1]: session-14.scope: Deactivated successfully.
Nov 29 06:13:09 compute-2 systemd[1]: session-14.scope: Consumed 1.590s CPU time.
Nov 29 06:13:09 compute-2 systemd-logind[784]: Session 14 logged out. Waiting for processes to exit.
Nov 29 06:13:09 compute-2 systemd-logind[784]: Removed session 14.
Nov 29 06:13:14 compute-2 sshd-session[59031]: Accepted publickey for zuul from 192.168.122.30 port 35918 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:13:14 compute-2 systemd-logind[784]: New session 15 of user zuul.
Nov 29 06:13:14 compute-2 systemd[1]: Started Session 15 of User zuul.
Nov 29 06:13:14 compute-2 sshd-session[59031]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:13:15 compute-2 python3.9[59184]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:13:16 compute-2 sudo[59338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcxqacmnkwbibzhyyyymjdinpihnqzbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396796.161554-65-26161811212696/AnsiballZ_file.py'
Nov 29 06:13:16 compute-2 sudo[59338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:16 compute-2 python3.9[59340]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:16 compute-2 sudo[59338]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:17 compute-2 sudo[59513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfdrbbnjagkcksraunvunrutdrxhlghd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396797.140119-89-253032209619510/AnsiballZ_stat.py'
Nov 29 06:13:17 compute-2 sudo[59513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:17 compute-2 python3.9[59515]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:17 compute-2 sudo[59513]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:18 compute-2 sudo[59636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dedepwulphhulmcgnofuqdttdwkrbsdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396797.140119-89-253032209619510/AnsiballZ_copy.py'
Nov 29 06:13:18 compute-2 sudo[59636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:18 compute-2 python3.9[59638]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764396797.140119-89-253032209619510/.source.json _original_basename=.ibopix0k follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:18 compute-2 sudo[59636]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:19 compute-2 sudo[59788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qggsvasssxplmyztnvvtcbinehfbyaje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396799.2680554-158-263538102062156/AnsiballZ_stat.py'
Nov 29 06:13:19 compute-2 sudo[59788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:19 compute-2 python3.9[59790]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:19 compute-2 sudo[59788]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:20 compute-2 sudo[59911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nakqsaktyjbqceuljoszlpnukceplaxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396799.2680554-158-263538102062156/AnsiballZ_copy.py'
Nov 29 06:13:20 compute-2 sudo[59911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:20 compute-2 python3.9[59913]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396799.2680554-158-263538102062156/.source _original_basename=.xppcstfp follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:20 compute-2 sudo[59911]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:21 compute-2 sudo[60063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwngseeppzhyitwjtbkxtfbzlgmnssjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396801.094703-206-34693832598556/AnsiballZ_file.py'
Nov 29 06:13:21 compute-2 sudo[60063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:21 compute-2 python3.9[60065]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:13:21 compute-2 sudo[60063]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:22 compute-2 sudo[60215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bknuafudtkxanmfzzyimpowsgytazeed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396801.9480999-231-47786330041578/AnsiballZ_stat.py'
Nov 29 06:13:22 compute-2 sudo[60215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:22 compute-2 python3.9[60217]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:22 compute-2 sudo[60215]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:22 compute-2 sudo[60338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lumqlsoopuuyeebhqxohtsiwwarcgtki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396801.9480999-231-47786330041578/AnsiballZ_copy.py'
Nov 29 06:13:22 compute-2 sudo[60338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:23 compute-2 python3.9[60340]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396801.9480999-231-47786330041578/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:13:23 compute-2 sudo[60338]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:23 compute-2 sudo[60491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhyultqpyzcvfdszlcbnrgwcjigwwglx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396803.2062855-231-225834578073087/AnsiballZ_stat.py'
Nov 29 06:13:23 compute-2 sudo[60491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:23 compute-2 python3.9[60493]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:23 compute-2 sudo[60491]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:23 compute-2 sudo[60616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zabylsefpltvgeeubvotdgtjgthogypf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396803.2062855-231-225834578073087/AnsiballZ_copy.py'
Nov 29 06:13:23 compute-2 sudo[60616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:24 compute-2 python3.9[60618]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396803.2062855-231-225834578073087/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:13:24 compute-2 sudo[60616]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:24 compute-2 sshd-session[60597]: Connection closed by authenticating user root 92.118.39.92 port 34830 [preauth]
Nov 29 06:13:24 compute-2 sudo[60768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfbfgoajkqofywimswcukexwdarqlrpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396804.6152382-317-125796461103464/AnsiballZ_file.py'
Nov 29 06:13:24 compute-2 sudo[60768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:25 compute-2 python3.9[60770]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:25 compute-2 sudo[60768]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:25 compute-2 sudo[60920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqvjnuxuyjvbslnqlztccudjhladcrve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396805.4738147-342-1478881823389/AnsiballZ_stat.py'
Nov 29 06:13:25 compute-2 sudo[60920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:26 compute-2 python3.9[60922]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:26 compute-2 sudo[60920]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:26 compute-2 sudo[61043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjhtcbgrdnysngigogjjlrubixdunfac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396805.4738147-342-1478881823389/AnsiballZ_copy.py'
Nov 29 06:13:26 compute-2 sudo[61043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:26 compute-2 python3.9[61045]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396805.4738147-342-1478881823389/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:26 compute-2 sudo[61043]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:27 compute-2 sudo[61195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxhpeigivkdghibxpzwkmgsncepmowur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396806.9685643-386-201103530186761/AnsiballZ_stat.py'
Nov 29 06:13:27 compute-2 sudo[61195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:27 compute-2 python3.9[61197]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:27 compute-2 sudo[61195]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:28 compute-2 sudo[61318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agngtipnmfmkpkipyofurpgnwyfvvrdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396806.9685643-386-201103530186761/AnsiballZ_copy.py'
Nov 29 06:13:28 compute-2 sudo[61318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:28 compute-2 python3.9[61320]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396806.9685643-386-201103530186761/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:28 compute-2 sudo[61318]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:29 compute-2 sudo[61470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smjxkyyluemnwsclptzuytndmuooyqjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396808.5760434-431-170546651411632/AnsiballZ_systemd.py'
Nov 29 06:13:29 compute-2 sudo[61470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:29 compute-2 python3.9[61472]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:13:29 compute-2 systemd[1]: Reloading.
Nov 29 06:13:29 compute-2 systemd-sysv-generator[61503]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:29 compute-2 systemd-rc-local-generator[61500]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:29 compute-2 systemd[1]: Reloading.
Nov 29 06:13:29 compute-2 systemd-sysv-generator[61534]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:29 compute-2 systemd-rc-local-generator[61529]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:30 compute-2 systemd[1]: Starting EDPM Container Shutdown...
Nov 29 06:13:30 compute-2 systemd[1]: Finished EDPM Container Shutdown.
Nov 29 06:13:30 compute-2 sudo[61470]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:31 compute-2 sudo[61697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coygvlijixocjqckqkueszedzelmmbra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396810.8107858-456-252325729249471/AnsiballZ_stat.py'
Nov 29 06:13:31 compute-2 sudo[61697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:31 compute-2 python3.9[61699]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:31 compute-2 sudo[61697]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:31 compute-2 sudo[61820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naprutrjcalyjjkrluochjgnptfmwnqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396810.8107858-456-252325729249471/AnsiballZ_copy.py'
Nov 29 06:13:31 compute-2 sudo[61820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:32 compute-2 python3.9[61822]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396810.8107858-456-252325729249471/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:32 compute-2 sudo[61820]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:32 compute-2 sudo[61972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vchewhgpewewyjyjvprikyawqedkcgrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396812.2890506-501-46785231949354/AnsiballZ_stat.py'
Nov 29 06:13:32 compute-2 sudo[61972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:32 compute-2 python3.9[61974]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:32 compute-2 sudo[61972]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:33 compute-2 sudo[62095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pphrhqrulndnsqhetmlaomxrykvvyqrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396812.2890506-501-46785231949354/AnsiballZ_copy.py'
Nov 29 06:13:33 compute-2 sudo[62095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:33 compute-2 python3.9[62097]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396812.2890506-501-46785231949354/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:33 compute-2 sudo[62095]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:34 compute-2 sudo[62247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riqwcnwxgfungsipzkvlebgqtnholgfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396813.7582955-546-280658584529921/AnsiballZ_systemd.py'
Nov 29 06:13:34 compute-2 sudo[62247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:34 compute-2 python3.9[62249]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:13:34 compute-2 systemd[1]: Reloading.
Nov 29 06:13:34 compute-2 systemd-rc-local-generator[62279]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:34 compute-2 systemd-sysv-generator[62284]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:34 compute-2 systemd[1]: Reloading.
Nov 29 06:13:34 compute-2 systemd-rc-local-generator[62317]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:34 compute-2 systemd-sysv-generator[62321]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:34 compute-2 systemd[1]: Starting Create netns directory...
Nov 29 06:13:34 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:13:34 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:13:34 compute-2 systemd[1]: Finished Create netns directory.
Nov 29 06:13:34 compute-2 sudo[62247]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:36 compute-2 python3.9[62476]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:13:36 compute-2 network[62493]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:13:36 compute-2 network[62494]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:13:36 compute-2 network[62495]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:13:40 compute-2 sudo[62755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtvggylmwqqlebikcptctjlhgvqsrnjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396820.519533-593-192890757233812/AnsiballZ_systemd.py'
Nov 29 06:13:40 compute-2 sudo[62755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:41 compute-2 python3.9[62757]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:13:41 compute-2 systemd[1]: Reloading.
Nov 29 06:13:41 compute-2 systemd-rc-local-generator[62786]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:41 compute-2 systemd-sysv-generator[62791]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:41 compute-2 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 29 06:13:41 compute-2 iptables.init[62798]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 29 06:13:41 compute-2 iptables.init[62798]: iptables: Flushing firewall rules: [  OK  ]
Nov 29 06:13:41 compute-2 systemd[1]: iptables.service: Deactivated successfully.
Nov 29 06:13:41 compute-2 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 29 06:13:41 compute-2 sudo[62755]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:42 compute-2 sudo[62993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhpxssyqmrnpkmsibilcfvoaexhadzol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396821.845836-593-101661068763758/AnsiballZ_systemd.py'
Nov 29 06:13:42 compute-2 sudo[62993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:42 compute-2 python3.9[62995]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:13:42 compute-2 sudo[62993]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:44 compute-2 sudo[63147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mousixswrrnwdbajbzmljahdqmkyewnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396824.3888407-641-133594693741068/AnsiballZ_systemd.py'
Nov 29 06:13:44 compute-2 sudo[63147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:45 compute-2 python3.9[63149]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:13:45 compute-2 systemd[1]: Reloading.
Nov 29 06:13:45 compute-2 systemd-rc-local-generator[63171]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:45 compute-2 systemd-sysv-generator[63176]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:45 compute-2 systemd[1]: Starting Netfilter Tables...
Nov 29 06:13:45 compute-2 systemd[1]: Finished Netfilter Tables.
Nov 29 06:13:45 compute-2 sudo[63147]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:46 compute-2 sudo[63338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqvzhzvzgkianwojbmurvxiqxbvwdpaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396825.7874331-666-74019099714111/AnsiballZ_command.py'
Nov 29 06:13:46 compute-2 sudo[63338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:46 compute-2 python3.9[63340]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:13:46 compute-2 sudo[63338]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:47 compute-2 sudo[63491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvatapxgzhwflmsmijvdwobhejbbkvpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396827.1172018-708-262227662948596/AnsiballZ_stat.py'
Nov 29 06:13:47 compute-2 sudo[63491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:47 compute-2 python3.9[63493]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:47 compute-2 sudo[63491]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:48 compute-2 sudo[63616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtthzyzqdyummyngkscbxipwelegnowe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396827.1172018-708-262227662948596/AnsiballZ_copy.py'
Nov 29 06:13:48 compute-2 sudo[63616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:48 compute-2 python3.9[63618]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396827.1172018-708-262227662948596/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:48 compute-2 sudo[63616]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:49 compute-2 sudo[63769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhhzufjwibibnsgdovsjfzliiwqjinwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396828.7770948-753-205530088758611/AnsiballZ_systemd.py'
Nov 29 06:13:49 compute-2 sudo[63769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:49 compute-2 python3.9[63771]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:13:49 compute-2 systemd[1]: Reloading OpenSSH server daemon...
Nov 29 06:13:49 compute-2 sshd[1004]: Received SIGHUP; restarting.
Nov 29 06:13:49 compute-2 sshd[1004]: Server listening on 0.0.0.0 port 22.
Nov 29 06:13:49 compute-2 sshd[1004]: Server listening on :: port 22.
Nov 29 06:13:49 compute-2 systemd[1]: Reloaded OpenSSH server daemon.
Nov 29 06:13:49 compute-2 sudo[63769]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:49 compute-2 sudo[63925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzfvzmpiqcmmvpdldmxzgzuxmwqbqrku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396829.7235315-777-157172996801686/AnsiballZ_file.py'
Nov 29 06:13:49 compute-2 sudo[63925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:50 compute-2 python3.9[63927]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:50 compute-2 sudo[63925]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:50 compute-2 sudo[64077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khvecqfuhlvdxhudjamqmzlmawopgeux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396830.5878987-801-137183532355338/AnsiballZ_stat.py'
Nov 29 06:13:50 compute-2 sudo[64077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:51 compute-2 python3.9[64079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:51 compute-2 sudo[64077]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:51 compute-2 sudo[64200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swpcdzvwfgwpucqydgphvafhulsrpdbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396830.5878987-801-137183532355338/AnsiballZ_copy.py'
Nov 29 06:13:51 compute-2 sudo[64200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:51 compute-2 python3.9[64202]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396830.5878987-801-137183532355338/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:51 compute-2 sudo[64200]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:52 compute-2 sudo[64353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdshtutqwwydtujazetrgcvtguwidamp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396832.4299965-855-93293040275456/AnsiballZ_timezone.py'
Nov 29 06:13:52 compute-2 sudo[64353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:53 compute-2 python3.9[64355]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 06:13:53 compute-2 systemd[1]: Starting Time & Date Service...
Nov 29 06:13:53 compute-2 systemd[1]: Started Time & Date Service.
Nov 29 06:13:53 compute-2 sudo[64353]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:53 compute-2 sudo[64509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjuheumdkrytvmqeckknmaeefcqrgsvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396833.692132-882-71132371720094/AnsiballZ_file.py'
Nov 29 06:13:53 compute-2 sudo[64509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:54 compute-2 python3.9[64511]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:54 compute-2 sudo[64509]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:54 compute-2 sudo[64661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzikepmadrdaphlgtdyuiwpoxsqvhict ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396834.5090587-906-69822310045952/AnsiballZ_stat.py'
Nov 29 06:13:54 compute-2 sudo[64661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:55 compute-2 python3.9[64663]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:55 compute-2 sudo[64661]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:55 compute-2 sudo[64784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhqjbtuqntheykinznbseillykjtopgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396834.5090587-906-69822310045952/AnsiballZ_copy.py'
Nov 29 06:13:55 compute-2 sudo[64784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:55 compute-2 python3.9[64786]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396834.5090587-906-69822310045952/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:55 compute-2 sudo[64784]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:56 compute-2 sudo[64936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjtsjgdvnymcbnbaiwhxczdavonncpsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396836.0172725-951-131113944525259/AnsiballZ_stat.py'
Nov 29 06:13:56 compute-2 sudo[64936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:56 compute-2 python3.9[64938]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:56 compute-2 sudo[64936]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:56 compute-2 sudo[65059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrhddfrsffabhxbkcqkuxxoywhenbbwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396836.0172725-951-131113944525259/AnsiballZ_copy.py'
Nov 29 06:13:56 compute-2 sudo[65059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:56 compute-2 python3.9[65061]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396836.0172725-951-131113944525259/.source.yaml _original_basename=.kxc91fh0 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:56 compute-2 sudo[65059]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:57 compute-2 sudo[65211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlqtnlnocmzlixopramtjmjzfwlsrvoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396837.5864322-995-277978008204517/AnsiballZ_stat.py'
Nov 29 06:13:57 compute-2 sudo[65211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:58 compute-2 python3.9[65213]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:58 compute-2 sudo[65211]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:58 compute-2 sudo[65334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfgzahyjwskhteqkyjflsosbebzubqty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396837.5864322-995-277978008204517/AnsiballZ_copy.py'
Nov 29 06:13:58 compute-2 sudo[65334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:58 compute-2 python3.9[65336]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396837.5864322-995-277978008204517/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:58 compute-2 sudo[65334]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:59 compute-2 sudo[65486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dafwtyetnvopitaoowqufnotebyokjoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396839.2427914-1041-193185527744815/AnsiballZ_command.py'
Nov 29 06:13:59 compute-2 sudo[65486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:59 compute-2 python3.9[65488]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:13:59 compute-2 sudo[65486]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:00 compute-2 sudo[65639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmfnshrnimmmyyzqqxmvotbimvqmwtnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396840.1216435-1065-203452004411844/AnsiballZ_command.py'
Nov 29 06:14:00 compute-2 sudo[65639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:00 compute-2 python3.9[65641]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:00 compute-2 sudo[65639]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:01 compute-2 sudo[65792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hokxptfzuyfnnmxgrqpxuvxbsnlygswp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764396841.063011-1089-5939841444948/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:14:01 compute-2 sudo[65792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:01 compute-2 python3[65794]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:14:01 compute-2 sudo[65792]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:02 compute-2 sudo[65944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvpuajbidsyxqzvijtqhxywfxkaxwyzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396841.9311297-1113-108322063342529/AnsiballZ_stat.py'
Nov 29 06:14:02 compute-2 sudo[65944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:02 compute-2 python3.9[65946]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:14:02 compute-2 sudo[65944]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:02 compute-2 sudo[66067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxyjmnyhdmqupasyezzbusmjybqabami ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396841.9311297-1113-108322063342529/AnsiballZ_copy.py'
Nov 29 06:14:02 compute-2 sudo[66067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:02 compute-2 python3.9[66069]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396841.9311297-1113-108322063342529/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:02 compute-2 sudo[66067]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:03 compute-2 sudo[66219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyqqfwjzxpwzetkeybzrercrgqsqrjbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396843.5848095-1158-148001169850308/AnsiballZ_stat.py'
Nov 29 06:14:03 compute-2 sudo[66219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:04 compute-2 python3.9[66221]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:14:04 compute-2 sudo[66219]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:04 compute-2 sudo[66342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-porvrieplzszlmkueuaujtgifylniclr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396843.5848095-1158-148001169850308/AnsiballZ_copy.py'
Nov 29 06:14:04 compute-2 sudo[66342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:04 compute-2 python3.9[66344]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396843.5848095-1158-148001169850308/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:04 compute-2 sudo[66342]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:05 compute-2 sudo[66494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptxgmoizqubppmdyuzttjdyepibsadfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396845.1424658-1203-37248642413795/AnsiballZ_stat.py'
Nov 29 06:14:05 compute-2 sudo[66494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:05 compute-2 python3.9[66496]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:14:05 compute-2 sudo[66494]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:05 compute-2 sudo[66617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfmhqxdxqpdqcqgissqwxztuuhiockdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396845.1424658-1203-37248642413795/AnsiballZ_copy.py'
Nov 29 06:14:05 compute-2 sudo[66617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:06 compute-2 python3.9[66619]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396845.1424658-1203-37248642413795/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:06 compute-2 sudo[66617]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:07 compute-2 sudo[66769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbazxdskynifnypjuqqnbzekvoiywwvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396846.6796222-1247-36403981546541/AnsiballZ_stat.py'
Nov 29 06:14:07 compute-2 sudo[66769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:07 compute-2 python3.9[66771]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:14:07 compute-2 sudo[66769]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:07 compute-2 sudo[66892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-padlibivdpzfjslrvwccdjytjlksfsnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396846.6796222-1247-36403981546541/AnsiballZ_copy.py'
Nov 29 06:14:07 compute-2 sudo[66892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:07 compute-2 python3.9[66894]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396846.6796222-1247-36403981546541/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:07 compute-2 sudo[66892]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:08 compute-2 irqbalance[780]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 29 06:14:08 compute-2 irqbalance[780]: IRQ 26 affinity is now unmanaged
Nov 29 06:14:08 compute-2 sudo[67044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaptqlosxpcwfxbwtssoyivmskiofvpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396848.2262766-1293-262802204904238/AnsiballZ_stat.py'
Nov 29 06:14:08 compute-2 sudo[67044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:08 compute-2 python3.9[67046]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:14:08 compute-2 sudo[67044]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:09 compute-2 sudo[67167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlwuslirftmamxysdesfujmjwwvpkyeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396848.2262766-1293-262802204904238/AnsiballZ_copy.py'
Nov 29 06:14:09 compute-2 sudo[67167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:09 compute-2 python3.9[67169]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396848.2262766-1293-262802204904238/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:09 compute-2 sudo[67167]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:10 compute-2 sudo[67319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzitiiiyfywqkkevnhyuwknfsntuoflz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396849.8190029-1338-140752843193680/AnsiballZ_file.py'
Nov 29 06:14:10 compute-2 sudo[67319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:10 compute-2 python3.9[67321]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:10 compute-2 sudo[67319]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:11 compute-2 sudo[67471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siywzoatckeaccnurevbjqjshhcvgrqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396850.66831-1362-47401078439307/AnsiballZ_command.py'
Nov 29 06:14:11 compute-2 sudo[67471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:11 compute-2 python3.9[67473]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:11 compute-2 sudo[67471]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:12 compute-2 sudo[67630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihewzwdnhzlfeupyijmbtufkdpyztvoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396851.6340237-1386-215789292675439/AnsiballZ_blockinfile.py'
Nov 29 06:14:12 compute-2 sudo[67630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:12 compute-2 python3.9[67632]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:12 compute-2 sudo[67630]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:13 compute-2 sudo[67783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoydbdtzfqeczgixbztpoicplnvmskxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396852.8786924-1412-202207986677513/AnsiballZ_file.py'
Nov 29 06:14:13 compute-2 sudo[67783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:13 compute-2 python3.9[67785]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:13 compute-2 sudo[67783]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:14 compute-2 sudo[67935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opappitizkagcjvjoajmegicegyaipha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396853.9301243-1412-149768878441881/AnsiballZ_file.py'
Nov 29 06:14:14 compute-2 sudo[67935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:14 compute-2 python3.9[67937]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:14 compute-2 sudo[67935]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:15 compute-2 sudo[68087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biigmacxfrmbkueiksgxpiuetxrmfwdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396854.817806-1457-103055379498883/AnsiballZ_mount.py'
Nov 29 06:14:15 compute-2 sudo[68087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:15 compute-2 python3.9[68089]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 06:14:15 compute-2 sudo[68087]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:15 compute-2 sudo[68240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-howxrvuufpsnsnevietualozkvhudyiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396855.5609121-1457-279243391439091/AnsiballZ_mount.py'
Nov 29 06:14:15 compute-2 sudo[68240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:16 compute-2 python3.9[68242]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 06:14:16 compute-2 sudo[68240]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:16 compute-2 sshd-session[59034]: Connection closed by 192.168.122.30 port 35918
Nov 29 06:14:16 compute-2 sshd-session[59031]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:14:16 compute-2 systemd-logind[784]: Session 15 logged out. Waiting for processes to exit.
Nov 29 06:14:16 compute-2 systemd[1]: session-15.scope: Deactivated successfully.
Nov 29 06:14:16 compute-2 systemd[1]: session-15.scope: Consumed 38.027s CPU time.
Nov 29 06:14:16 compute-2 systemd-logind[784]: Removed session 15.
Nov 29 06:14:23 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 06:14:26 compute-2 sshd-session[68271]: Accepted publickey for zuul from 192.168.122.30 port 41564 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:14:26 compute-2 systemd-logind[784]: New session 16 of user zuul.
Nov 29 06:14:26 compute-2 systemd[1]: Started Session 16 of User zuul.
Nov 29 06:14:26 compute-2 sshd-session[68271]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:14:27 compute-2 sudo[68424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtssqtgluwnkisxbdfbxnxlcgurhlzfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396866.648143-25-181113549064180/AnsiballZ_tempfile.py'
Nov 29 06:14:27 compute-2 sudo[68424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:27 compute-2 python3.9[68426]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 06:14:27 compute-2 sudo[68424]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:28 compute-2 sudo[68576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twneuhxxqazpdvyqjkhkxghiurnjharp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396867.6591983-61-115209770087889/AnsiballZ_stat.py'
Nov 29 06:14:28 compute-2 sudo[68576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:28 compute-2 python3.9[68578]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:14:28 compute-2 sudo[68576]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:29 compute-2 sudo[68728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orucqrmhmqdahkwilxrvrmefjxgribmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396868.6981096-91-210159701479495/AnsiballZ_setup.py'
Nov 29 06:14:29 compute-2 sudo[68728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:29 compute-2 python3.9[68730]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:14:29 compute-2 sudo[68728]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:30 compute-2 sudo[68880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihenagewfjazdiefmxphufwjmymmsoeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396869.9902098-116-96595196134121/AnsiballZ_blockinfile.py'
Nov 29 06:14:30 compute-2 sudo[68880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:30 compute-2 python3.9[68882]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCX0dhB1m0xL0qEi5jnTQLLB4bvueVV5foNrqU/OkfV/4gRyp7uP2q21lWq5Dtl2GLk51pS6oD41RI41Y5g7OSRs8b1Z66d6X1QgX0Qns6pv7FwmNSQ25+2VGV6lppnaN5e+JHiwTmzpf82hl/MiiJrHo7B63mllKyl9SZJxUhP9RR4czS3QNYQsZyP7sZeCWothTZ2Q/GK4BWBEtj2+ifeOpa342IivopCH05YVQOx9bpsdFHMYaalMDCwvr2lfVns8aTcpJ3z9uE8wLdKWTyiinT7nuLX6RuPwhXB2proBRH1wrGSIUgcVcizkWn8QizD8LlsGFcHIQJkmq+sJz6r7cCZLIfS6hdAzI+hYbJie6n/agwfxe4r+mbXsmmC6ALKKk7CEnaiNnDg0fgTaUfBPwSfu+JmVrjdSO+S8f/CMbtYeO6QknOxhLV9oK6knszv7nLlSYXTzXanHkN4Y0fW3dsSvoE+qDR0YijbbT8slqMd6z95wWVDFUmTcN8Nzk8=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILci1PI4hoB56+xxS5gSMKceuJ/dv6t7etpmtENwoSFr
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJIaOLr2ntjSUcigXC7a0sFoonsuh0ChCx2a1R6G8EDmJ8/ZB8NEiJE6KAQJDNU5XsXjuaC44eJhOUMRK9r98xA=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2GXKCQiCwQEMihcSwDVeJtG2CpTemmA6MTbtOkxbB3OAV5PK8v8imPvDGMDurfGFQG0RzWyv9szlMJXdgIkwejIfy/AY7p6nemHOpu6DdAx0EA/jg1YcOIeeEhyMw1/oFzjYClGMohaI1oTKHtR29UXWphTAroOkf26Exvco6hh2ApRTXV9ObzSoOyCC7+OZcOWgYzdoCfu/0FDGkH2ksKLQS7d4AAh/XZ/njXhK57U7ptxHCReUPECGRv7KB4f8TelZDAIeUyp7ngd/9ivUDO1zue1Qr9ECzTzAFqippGXFmYl3+oSid03CY7bqnxav4xWt7UukbaO57goyIPfkklPdC1kA7kZqa9bqeDU1WgDkqnLu8hluArB0Y0Jz+hDfx9pTbAL6MklraoLaGrnrgcibAollAN+7WGqdWxUotENYaljO7P1Z18MlNllWFzk4Le5jMLNL8qArSlzM+ufOThnLdGEuYZhH1x969AisGQ4MQWn0P0lZFu6fE5VSNA/k=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDdPWx5WoFJTxz6PiFZL5f3XrtE682RjGFiIpoe0LXZO
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQlZMweHfLYiJFtm1r2tQze/oNx6KzgaXkK+Kof7POk0cFMLbTsXU8qgbQMh4o5LVO0Hbas4mAqxRkGcFCg2Po=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUVpPatup3d17omeiTdJaYR8jCcDbraJSPBxWy49Wxst4G+6/lD41HVIKmjgCgIbbmYSFBPQmoXt4gFXP4FRKna6AbQWi0kwF3/T2biQ2qCid0HVDSS8YRVlyrpdVc1/bIg6YNLkGnhzOMp0S1443+cg5PqutAbrAT1LOg6lSBu+K9gIqJ4un3l2guSweoyba5UhMyjrq4Pffx1QCuBggtYSjmA9Q1r5VVNc2J7AbP0QuzOe6J6DhpdGJsfmHDVXZb/4b/aPUdCTKkLseyUtcqElWVhhnGnpYSJdN81ejalSktGHE4JRHih19wwTokiKvoczUgijBzOfl+kt2ELcpDgzpzY0M9yd0Zz7wrK4rLM6hi8x3LYZXZv8N7KnawUcJ2jfzilx1BVLdNzgwDNB7ZlP4O9Vs3fKnBufCUFPNcRyWl6ooczepbgxqgSbr/Ham2O4/qzvJmzLtu0KxBkaFALRWnyM39nYVE/jrMKJ5ihtVDxIY9FGma/Jifg15gqI0=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN19pK3a7AH/OiwlqJTVWP/qzU/QzkC16s4D1xY1Vn6J
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLsXsjJNPVMX1YVTe2oBmcZpUSiv3HOeuICgZtQun4hTopMXH9dE1jQeUruGwqZ+NsKW6X2bLZZJ0/tcn2owL8Q=
                                             create=True mode=0644 path=/tmp/ansible.ayjbkyvy state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:30 compute-2 sudo[68880]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:31 compute-2 sudo[69032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwzjnywbqriezviyewspboapmjwcfzxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396870.9423754-140-10985918641430/AnsiballZ_command.py'
Nov 29 06:14:31 compute-2 sudo[69032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:31 compute-2 python3.9[69034]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ayjbkyvy' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:31 compute-2 sudo[69032]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:32 compute-2 sudo[69186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edxyntruiseumzkxzjukebipbqnmvfbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396871.8663344-165-55503341675260/AnsiballZ_file.py'
Nov 29 06:14:32 compute-2 sudo[69186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:32 compute-2 python3.9[69188]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ayjbkyvy state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:32 compute-2 sudo[69186]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:32 compute-2 sshd-session[68274]: Connection closed by 192.168.122.30 port 41564
Nov 29 06:14:32 compute-2 sshd-session[68271]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:14:32 compute-2 systemd[1]: session-16.scope: Deactivated successfully.
Nov 29 06:14:32 compute-2 systemd[1]: session-16.scope: Consumed 3.688s CPU time.
Nov 29 06:14:32 compute-2 systemd-logind[784]: Session 16 logged out. Waiting for processes to exit.
Nov 29 06:14:32 compute-2 systemd-logind[784]: Removed session 16.
Nov 29 06:14:39 compute-2 sshd-session[69213]: Accepted publickey for zuul from 192.168.122.30 port 56854 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:14:39 compute-2 systemd-logind[784]: New session 17 of user zuul.
Nov 29 06:14:39 compute-2 systemd[1]: Started Session 17 of User zuul.
Nov 29 06:14:39 compute-2 sshd-session[69213]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:14:40 compute-2 python3.9[69366]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:14:41 compute-2 sudo[69520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilxfoclzoupuncgflvlctqqefrnedfyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396880.5981097-63-238086868672630/AnsiballZ_systemd.py'
Nov 29 06:14:41 compute-2 sudo[69520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:41 compute-2 python3.9[69522]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 06:14:41 compute-2 sudo[69520]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:42 compute-2 sudo[69674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beknnkldenbqekhztkjwodetaqeyumjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396881.8357382-87-120884408660260/AnsiballZ_systemd.py'
Nov 29 06:14:42 compute-2 sudo[69674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:42 compute-2 python3.9[69676]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:14:42 compute-2 sudo[69674]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:44 compute-2 sudo[69827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkgfbkxwjtanbzebtxxvkohotneziygy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396883.9253302-115-146121962171177/AnsiballZ_command.py'
Nov 29 06:14:44 compute-2 sudo[69827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:44 compute-2 python3.9[69829]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:44 compute-2 sudo[69827]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:45 compute-2 sudo[69980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvdsvqrynthfajskkykbakpyaghjwwoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396884.8274662-139-107283131210738/AnsiballZ_stat.py'
Nov 29 06:14:45 compute-2 sudo[69980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:45 compute-2 python3.9[69982]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:14:45 compute-2 sudo[69980]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:46 compute-2 sudo[70134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmrpycneodxwusxpawejcavteiykiasx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396885.7723937-162-35233092261157/AnsiballZ_command.py'
Nov 29 06:14:46 compute-2 sudo[70134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:46 compute-2 python3.9[70136]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:46 compute-2 sudo[70134]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:47 compute-2 sudo[70289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzstdtnmbsdmiuodhhumnpxnpbjawknc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396886.6112518-186-64700559188204/AnsiballZ_file.py'
Nov 29 06:14:47 compute-2 sudo[70289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:47 compute-2 python3.9[70291]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:47 compute-2 sudo[70289]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:47 compute-2 sshd-session[69216]: Connection closed by 192.168.122.30 port 56854
Nov 29 06:14:47 compute-2 sshd-session[69213]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:14:47 compute-2 systemd[1]: session-17.scope: Deactivated successfully.
Nov 29 06:14:47 compute-2 systemd[1]: session-17.scope: Consumed 4.541s CPU time.
Nov 29 06:14:47 compute-2 systemd-logind[784]: Session 17 logged out. Waiting for processes to exit.
Nov 29 06:14:47 compute-2 systemd-logind[784]: Removed session 17.
Nov 29 06:14:53 compute-2 sshd-session[70316]: Accepted publickey for zuul from 192.168.122.30 port 51720 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:14:53 compute-2 systemd-logind[784]: New session 18 of user zuul.
Nov 29 06:14:53 compute-2 systemd[1]: Started Session 18 of User zuul.
Nov 29 06:14:53 compute-2 sshd-session[70316]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:14:54 compute-2 python3.9[70469]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:14:55 compute-2 sudo[70623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knggtpzkgknxbgakvbtjiflbxuofzlom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396894.8700933-69-149108293478495/AnsiballZ_setup.py'
Nov 29 06:14:55 compute-2 sudo[70623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:55 compute-2 python3.9[70625]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:14:55 compute-2 sudo[70623]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:56 compute-2 sudo[70707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rimtruveqhrcaatoqiglgqsasieikeaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396894.8700933-69-149108293478495/AnsiballZ_dnf.py'
Nov 29 06:14:56 compute-2 sudo[70707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:56 compute-2 python3.9[70709]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:14:57 compute-2 sudo[70707]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:58 compute-2 python3.9[70860]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:59 compute-2 python3.9[71011]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:15:00 compute-2 python3.9[71161]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:15:00 compute-2 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:15:00 compute-2 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:15:01 compute-2 python3.9[71312]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:15:02 compute-2 sshd-session[70319]: Connection closed by 192.168.122.30 port 51720
Nov 29 06:15:02 compute-2 sshd-session[70316]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:15:02 compute-2 systemd[1]: session-18.scope: Deactivated successfully.
Nov 29 06:15:02 compute-2 systemd[1]: session-18.scope: Consumed 5.568s CPU time.
Nov 29 06:15:02 compute-2 systemd-logind[784]: Session 18 logged out. Waiting for processes to exit.
Nov 29 06:15:02 compute-2 systemd-logind[784]: Removed session 18.
Nov 29 06:15:08 compute-2 chronyd[58550]: Selected source 142.4.192.253 (pool.ntp.org)
Nov 29 06:15:11 compute-2 sshd-session[71337]: Accepted publickey for zuul from 38.102.83.107 port 47870 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 06:15:11 compute-2 systemd-logind[784]: New session 19 of user zuul.
Nov 29 06:15:11 compute-2 systemd[1]: Started Session 19 of User zuul.
Nov 29 06:15:11 compute-2 sshd-session[71337]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:15:11 compute-2 sudo[71413]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usqrohlgnjcoxbwhnzhsbjwjfukqqvfq ; /usr/bin/python3'
Nov 29 06:15:11 compute-2 sudo[71413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:12 compute-2 useradd[71417]: new group: name=ceph-admin, GID=42478
Nov 29 06:15:12 compute-2 useradd[71417]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Nov 29 06:15:12 compute-2 sudo[71413]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:12 compute-2 sudo[71499]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihvodjylkciugeaqiscptlimtzhuxxpz ; /usr/bin/python3'
Nov 29 06:15:12 compute-2 sudo[71499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:12 compute-2 sudo[71499]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:13 compute-2 sudo[71572]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iucpacrqjjuptzhxqhndbkklyyitwgfn ; /usr/bin/python3'
Nov 29 06:15:13 compute-2 sudo[71572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:13 compute-2 sudo[71572]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:13 compute-2 sudo[71622]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxpugdeusaerjdvvqkjjjkipuesiwhrp ; /usr/bin/python3'
Nov 29 06:15:13 compute-2 sudo[71622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:13 compute-2 sudo[71622]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:14 compute-2 sudo[71648]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mszntppfftzictyczjoxlcgckkkryjzq ; /usr/bin/python3'
Nov 29 06:15:14 compute-2 sudo[71648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:14 compute-2 sudo[71648]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:14 compute-2 sudo[71674]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkcdairsxjpahkyargmpsjgyclibtfqj ; /usr/bin/python3'
Nov 29 06:15:14 compute-2 sudo[71674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:14 compute-2 sudo[71674]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:15 compute-2 sudo[71700]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqgbldkmmrdldjlcxkqwrkujxcumnsuf ; /usr/bin/python3'
Nov 29 06:15:15 compute-2 sudo[71700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:15 compute-2 sudo[71700]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:15 compute-2 sudo[71778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpvmvjhcztfbuejbjwxuempftgzhvkqo ; /usr/bin/python3'
Nov 29 06:15:15 compute-2 sudo[71778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:15 compute-2 sudo[71778]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:16 compute-2 sudo[71851]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgozrzmggrqdzyxgiiumvlromtjpicaf ; /usr/bin/python3'
Nov 29 06:15:16 compute-2 sudo[71851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:16 compute-2 sudo[71851]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:16 compute-2 sudo[71953]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plyofzersgqarrnnhytfwddfkzdqgjrb ; /usr/bin/python3'
Nov 29 06:15:16 compute-2 sudo[71953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:16 compute-2 sudo[71953]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:17 compute-2 sudo[72026]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkjraewwvvexpnhacjpyhvcnvfrjqwdj ; /usr/bin/python3'
Nov 29 06:15:17 compute-2 sudo[72026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:17 compute-2 sudo[72026]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:17 compute-2 sudo[72076]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hobggptgvajaawxpwecarysanaofrbod ; /usr/bin/python3'
Nov 29 06:15:17 compute-2 sudo[72076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:18 compute-2 python3[72078]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:15:19 compute-2 sudo[72076]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:19 compute-2 sudo[72171]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htwalwkppqxcjaajtegzfttxfxdhfqfi ; /usr/bin/python3'
Nov 29 06:15:19 compute-2 sudo[72171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:20 compute-2 python3[72173]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 06:15:21 compute-2 sudo[72171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:21 compute-2 sudo[72198]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thjahplxtqsvgcknpvjinbjpzbhrkzpl ; /usr/bin/python3'
Nov 29 06:15:21 compute-2 sudo[72198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:21 compute-2 python3[72200]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 06:15:21 compute-2 sudo[72198]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:22 compute-2 sudo[72224]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqbxjwhrahxmcliunjpwvxpqnwmprfgl ; /usr/bin/python3'
Nov 29 06:15:22 compute-2 sudo[72224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:22 compute-2 python3[72226]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:15:22 compute-2 kernel: loop: module loaded
Nov 29 06:15:22 compute-2 kernel: loop3: detected capacity change from 0 to 14680064
Nov 29 06:15:22 compute-2 sudo[72224]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:22 compute-2 sudo[72259]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwciwvwquodcplbszygyjhagzrqacoyo ; /usr/bin/python3'
Nov 29 06:15:22 compute-2 sudo[72259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:22 compute-2 python3[72261]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:15:22 compute-2 lvm[72264]: PV /dev/loop3 not used.
Nov 29 06:15:22 compute-2 lvm[72273]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:15:22 compute-2 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 29 06:15:22 compute-2 lvm[72275]:   1 logical volume(s) in volume group "ceph_vg0" now active
Nov 29 06:15:23 compute-2 sudo[72259]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:23 compute-2 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 29 06:15:24 compute-2 sudo[72351]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lygqvvyewtguquxfvjvsughvqfualbsn ; /usr/bin/python3'
Nov 29 06:15:24 compute-2 sudo[72351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:24 compute-2 python3[72353]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 06:15:24 compute-2 sudo[72351]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:24 compute-2 sudo[72424]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wecskoyeoyyrlccfsmvlvfswdxotdzeu ; /usr/bin/python3'
Nov 29 06:15:24 compute-2 sudo[72424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:24 compute-2 python3[72426]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764396923.9633152-37030-191459489699471/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:15:24 compute-2 sudo[72424]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:25 compute-2 sudo[72474]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wezcqfjgedkkbaxbniaojdcrehewpzlt ; /usr/bin/python3'
Nov 29 06:15:25 compute-2 sudo[72474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:25 compute-2 python3[72476]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:15:25 compute-2 systemd[1]: Reloading.
Nov 29 06:15:25 compute-2 systemd-rc-local-generator[72503]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:15:25 compute-2 systemd-sysv-generator[72508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:15:25 compute-2 systemd[1]: Starting Ceph OSD losetup...
Nov 29 06:15:25 compute-2 bash[72516]: /dev/loop3: [64513]:4327940 (/var/lib/ceph-osd-0.img)
Nov 29 06:15:25 compute-2 systemd[1]: Finished Ceph OSD losetup.
Nov 29 06:15:25 compute-2 sudo[72474]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:25 compute-2 lvm[72518]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:15:25 compute-2 lvm[72518]: VG ceph_vg0 finished
Nov 29 06:15:28 compute-2 python3[72542]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:15:34 compute-2 sshd-session[72586]: Invalid user jenkins from 92.118.39.92 port 56518
Nov 29 06:15:34 compute-2 sshd-session[72586]: Connection closed by invalid user jenkins 92.118.39.92 port 56518 [preauth]
Nov 29 06:17:35 compute-2 sshd-session[72589]: Accepted publickey for ceph-admin from 192.168.122.100 port 60168 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:35 compute-2 systemd[1]: Created slice User Slice of UID 42477.
Nov 29 06:17:35 compute-2 systemd[1]: Starting User Runtime Directory /run/user/42477...
Nov 29 06:17:35 compute-2 systemd-logind[784]: New session 20 of user ceph-admin.
Nov 29 06:17:35 compute-2 systemd[1]: Finished User Runtime Directory /run/user/42477.
Nov 29 06:17:35 compute-2 systemd[1]: Starting User Manager for UID 42477...
Nov 29 06:17:35 compute-2 systemd[72593]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:35 compute-2 systemd[72593]: Queued start job for default target Main User Target.
Nov 29 06:17:35 compute-2 sshd-session[72607]: Accepted publickey for ceph-admin from 192.168.122.100 port 60170 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:35 compute-2 systemd[72593]: Created slice User Application Slice.
Nov 29 06:17:35 compute-2 systemd[72593]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 06:17:35 compute-2 systemd[72593]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 06:17:35 compute-2 systemd[72593]: Reached target Paths.
Nov 29 06:17:35 compute-2 systemd[72593]: Reached target Timers.
Nov 29 06:17:35 compute-2 systemd[72593]: Starting D-Bus User Message Bus Socket...
Nov 29 06:17:35 compute-2 systemd[72593]: Starting Create User's Volatile Files and Directories...
Nov 29 06:17:35 compute-2 systemd-logind[784]: New session 22 of user ceph-admin.
Nov 29 06:17:35 compute-2 systemd[72593]: Finished Create User's Volatile Files and Directories.
Nov 29 06:17:35 compute-2 systemd[72593]: Listening on D-Bus User Message Bus Socket.
Nov 29 06:17:35 compute-2 systemd[72593]: Reached target Sockets.
Nov 29 06:17:35 compute-2 systemd[72593]: Reached target Basic System.
Nov 29 06:17:35 compute-2 systemd[72593]: Reached target Main User Target.
Nov 29 06:17:35 compute-2 systemd[72593]: Startup finished in 113ms.
Nov 29 06:17:35 compute-2 systemd[1]: Started User Manager for UID 42477.
Nov 29 06:17:35 compute-2 systemd[1]: Started Session 20 of User ceph-admin.
Nov 29 06:17:35 compute-2 systemd[1]: Started Session 22 of User ceph-admin.
Nov 29 06:17:35 compute-2 sshd-session[72589]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:35 compute-2 sshd-session[72607]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:35 compute-2 sudo[72614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:35 compute-2 sudo[72614]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:35 compute-2 sudo[72614]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-2 sudo[72639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:17:36 compute-2 sudo[72639]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-2 sudo[72639]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-2 sshd-session[72664]: Accepted publickey for ceph-admin from 192.168.122.100 port 60172 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:36 compute-2 systemd-logind[784]: New session 23 of user ceph-admin.
Nov 29 06:17:36 compute-2 systemd[1]: Started Session 23 of User ceph-admin.
Nov 29 06:17:36 compute-2 sshd-session[72664]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:36 compute-2 sudo[72668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:36 compute-2 sudo[72668]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-2 sudo[72668]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-2 sudo[72693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host --expect-hostname compute-2
Nov 29 06:17:36 compute-2 sudo[72693]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-2 sudo[72693]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-2 sshd-session[72718]: Accepted publickey for ceph-admin from 192.168.122.100 port 60188 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:36 compute-2 systemd-logind[784]: New session 24 of user ceph-admin.
Nov 29 06:17:36 compute-2 systemd[1]: Started Session 24 of User ceph-admin.
Nov 29 06:17:36 compute-2 sshd-session[72718]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:36 compute-2 sudo[72722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:36 compute-2 sudo[72722]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-2 sudo[72722]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-2 sudo[72747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d
Nov 29 06:17:36 compute-2 sudo[72747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-2 sudo[72747]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:37 compute-2 sshd-session[72772]: Accepted publickey for ceph-admin from 192.168.122.100 port 60200 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:37 compute-2 systemd-logind[784]: New session 25 of user ceph-admin.
Nov 29 06:17:37 compute-2 systemd[1]: Started Session 25 of User ceph-admin.
Nov 29 06:17:37 compute-2 sshd-session[72772]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:37 compute-2 sudo[72776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:37 compute-2 sudo[72776]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:37 compute-2 sudo[72776]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:37 compute-2 sudo[72801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:37 compute-2 sudo[72801]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:37 compute-2 sudo[72801]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:37 compute-2 sshd-session[72826]: Accepted publickey for ceph-admin from 192.168.122.100 port 60214 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:37 compute-2 systemd-logind[784]: New session 26 of user ceph-admin.
Nov 29 06:17:37 compute-2 systemd[1]: Started Session 26 of User ceph-admin.
Nov 29 06:17:37 compute-2 sshd-session[72826]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:37 compute-2 sudo[72830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:37 compute-2 sudo[72830]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:37 compute-2 sudo[72830]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:37 compute-2 sudo[72855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:37 compute-2 sudo[72855]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:37 compute-2 sudo[72855]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-2 sshd-session[72880]: Accepted publickey for ceph-admin from 192.168.122.100 port 60230 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:38 compute-2 systemd-logind[784]: New session 27 of user ceph-admin.
Nov 29 06:17:38 compute-2 systemd[1]: Started Session 27 of User ceph-admin.
Nov 29 06:17:38 compute-2 sshd-session[72880]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:38 compute-2 sudo[72884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:38 compute-2 sudo[72884]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-2 sudo[72884]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-2 sudo[72909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d.new
Nov 29 06:17:38 compute-2 sudo[72909]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-2 sudo[72909]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-2 sshd-session[72934]: Accepted publickey for ceph-admin from 192.168.122.100 port 60240 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:38 compute-2 systemd-logind[784]: New session 28 of user ceph-admin.
Nov 29 06:17:38 compute-2 systemd[1]: Started Session 28 of User ceph-admin.
Nov 29 06:17:38 compute-2 sshd-session[72934]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:38 compute-2 sudo[72938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:38 compute-2 sudo[72938]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-2 sudo[72938]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-2 sudo[72963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:38 compute-2 sudo[72963]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-2 sudo[72963]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-2 sshd-session[72988]: Accepted publickey for ceph-admin from 192.168.122.100 port 60242 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:38 compute-2 systemd-logind[784]: New session 29 of user ceph-admin.
Nov 29 06:17:38 compute-2 systemd[1]: Started Session 29 of User ceph-admin.
Nov 29 06:17:38 compute-2 sshd-session[72988]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:38 compute-2 sudo[72992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:38 compute-2 sudo[72992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-2 sudo[72992]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:39 compute-2 sudo[73017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d.new
Nov 29 06:17:39 compute-2 sudo[73017]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:39 compute-2 sudo[73017]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:39 compute-2 sshd-session[73042]: Accepted publickey for ceph-admin from 192.168.122.100 port 60246 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:39 compute-2 systemd-logind[784]: New session 30 of user ceph-admin.
Nov 29 06:17:39 compute-2 systemd[1]: Started Session 30 of User ceph-admin.
Nov 29 06:17:39 compute-2 sshd-session[73042]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:39 compute-2 sshd-session[73069]: Accepted publickey for ceph-admin from 192.168.122.100 port 60260 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:39 compute-2 systemd-logind[784]: New session 31 of user ceph-admin.
Nov 29 06:17:39 compute-2 systemd[1]: Started Session 31 of User ceph-admin.
Nov 29 06:17:39 compute-2 sshd-session[73069]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:39 compute-2 sudo[73073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:39 compute-2 sudo[73073]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:39 compute-2 sudo[73073]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:40 compute-2 sudo[73098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d.new /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d
Nov 29 06:17:40 compute-2 sudo[73098]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:40 compute-2 sudo[73098]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:40 compute-2 sshd-session[73123]: Accepted publickey for ceph-admin from 192.168.122.100 port 60264 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:40 compute-2 systemd-logind[784]: New session 32 of user ceph-admin.
Nov 29 06:17:40 compute-2 systemd[1]: Started Session 32 of User ceph-admin.
Nov 29 06:17:40 compute-2 sshd-session[73123]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:40 compute-2 sudo[73127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:40 compute-2 sudo[73127]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:40 compute-2 sudo[73127]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:40 compute-2 sudo[73152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host --expect-hostname compute-2
Nov 29 06:17:40 compute-2 sudo[73152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:40 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:40 compute-2 sudo[73152]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-2 sshd-session[73197]: Connection closed by authenticating user root 92.118.39.92 port 49952 [preauth]
Nov 29 06:18:37 compute-2 sudo[73199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:37 compute-2 sudo[73199]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:37 compute-2 sudo[73199]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:37 compute-2 sudo[73224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:18:37 compute-2 sudo[73224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:37 compute-2 sudo[73224]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-2 sudo[73249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:38 compute-2 sudo[73249]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-2 sudo[73249]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-2 sudo[73274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:38 compute-2 sudo[73274]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-2 sudo[73274]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-2 sudo[73299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:38 compute-2 sudo[73299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-2 sudo[73299]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-2 sudo[73324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Nov 29 06:18:38 compute-2 sudo[73324]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:18:38 compute-2 sudo[73324]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-2 sudo[73368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:38 compute-2 sudo[73368]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-2 sudo[73368]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-2 sudo[73393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:38 compute-2 sudo[73393]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-2 sudo[73393]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-2 sudo[73418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:38 compute-2 sudo[73418]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-2 sudo[73418]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-2 sudo[73443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:18:38 compute-2 sudo[73443]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:18:39 compute-2 sudo[73443]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:39 compute-2 sudo[73504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:39 compute-2 sudo[73504]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:39 compute-2 sudo[73504]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:39 compute-2 sudo[73529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:39 compute-2 sudo[73529]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:39 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:18:39 compute-2 sudo[73529]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:39 compute-2 sudo[73554]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:39 compute-2 sudo[73554]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:39 compute-2 sudo[73554]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:39 compute-2 sudo[73579]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:18:39 compute-2 sudo[73579]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:39 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:18:39 compute-2 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73617 (sysctl)
Nov 29 06:18:39 compute-2 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 29 06:18:39 compute-2 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 29 06:18:40 compute-2 sudo[73579]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-2 sudo[73639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:40 compute-2 sudo[73639]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-2 sudo[73639]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-2 sudo[73664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:40 compute-2 sudo[73664]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-2 sudo[73664]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-2 sudo[73689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:40 compute-2 sudo[73689]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-2 sudo[73689]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-2 sudo[73714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 list-networks
Nov 29 06:18:40 compute-2 sudo[73714]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:18:40 compute-2 sudo[73714]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-2 sudo[73756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:40 compute-2 sudo[73756]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-2 sudo[73756]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-2 sudo[73781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:40 compute-2 sudo[73781]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-2 sudo[73781]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-2 sudo[73806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:40 compute-2 sudo[73806]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-2 sudo[73806]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:41 compute-2 sudo[73831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- inventory --format=json-pretty --filter-for-batch
Nov 29 06:18:41 compute-2 sudo[73831]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:41 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:18:45 compute-2 systemd[1]: var-lib-containers-storage-overlay-compat804025874-lower\x2dmapped.mount: Deactivated successfully.
Nov 29 06:19:09 compute-2 podman[73893]: 2025-11-29 06:19:09.982071152 +0000 UTC m=+28.714164910 container create 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Nov 29 06:19:10 compute-2 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 29 06:19:10 compute-2 systemd[1]: Started libpod-conmon-51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b.scope.
Nov 29 06:19:10 compute-2 podman[73893]: 2025-11-29 06:19:09.968078728 +0000 UTC m=+28.700172526 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:10 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:10 compute-2 podman[73893]: 2025-11-29 06:19:10.076261001 +0000 UTC m=+28.808354819 container init 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Nov 29 06:19:10 compute-2 podman[73893]: 2025-11-29 06:19:10.084791342 +0000 UTC m=+28.816885120 container start 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Nov 29 06:19:10 compute-2 podman[73893]: 2025-11-29 06:19:10.088200561 +0000 UTC m=+28.820294379 container attach 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Nov 29 06:19:10 compute-2 recursing_swirles[73955]: 167 167
Nov 29 06:19:10 compute-2 systemd[1]: libpod-51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b.scope: Deactivated successfully.
Nov 29 06:19:10 compute-2 podman[73893]: 2025-11-29 06:19:10.091596909 +0000 UTC m=+28.823690677 container died 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:10 compute-2 systemd[1]: var-lib-containers-storage-overlay-5c0c0d2e07127ddb4aeb2fa9832df785b317e2db297c37e229627d9ea11c1aca-merged.mount: Deactivated successfully.
Nov 29 06:19:10 compute-2 podman[73893]: 2025-11-29 06:19:10.129664939 +0000 UTC m=+28.861758707 container remove 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:10 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:19:10 compute-2 systemd[1]: libpod-conmon-51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b.scope: Deactivated successfully.
Nov 29 06:19:10 compute-2 podman[73980]: 2025-11-29 06:19:10.305765217 +0000 UTC m=+0.052567218 container create 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:10 compute-2 systemd[1]: Started libpod-conmon-2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371.scope.
Nov 29 06:19:10 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f7dc80d8b1dac7ac36b089ee3254f9af9877e161a8d97dbd0fc1cf37c2fdb76/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:10 compute-2 podman[73980]: 2025-11-29 06:19:10.280275284 +0000 UTC m=+0.027077305 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f7dc80d8b1dac7ac36b089ee3254f9af9877e161a8d97dbd0fc1cf37c2fdb76/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:10 compute-2 podman[73980]: 2025-11-29 06:19:10.409298299 +0000 UTC m=+0.156100330 container init 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:10 compute-2 podman[73980]: 2025-11-29 06:19:10.420413547 +0000 UTC m=+0.167215548 container start 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Nov 29 06:19:10 compute-2 podman[73980]: 2025-11-29 06:19:10.425222552 +0000 UTC m=+0.172024533 container attach 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:11 compute-2 sweet_wilson[73996]: [
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:     {
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:         "available": false,
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:         "ceph_device": false,
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:         "lsm_data": {},
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:         "lvs": [],
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:         "path": "/dev/sr0",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:         "rejected_reasons": [
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "Insufficient space (<5GB)",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "Has a FileSystem"
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:         ],
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:         "sys_api": {
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "actuators": null,
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "device_nodes": "sr0",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "devname": "sr0",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "human_readable_size": "482.00 KB",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "id_bus": "ata",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "model": "QEMU DVD-ROM",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "nr_requests": "2",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "parent": "/dev/sr0",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "partitions": {},
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "path": "/dev/sr0",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "removable": "1",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "rev": "2.5+",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "ro": "0",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "rotational": "1",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "sas_address": "",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "sas_device_handle": "",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "scheduler_mode": "mq-deadline",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "sectors": 0,
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "sectorsize": "2048",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "size": 493568.0,
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "support_discard": "2048",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "type": "disk",
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:             "vendor": "QEMU"
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:         }
Nov 29 06:19:11 compute-2 sweet_wilson[73996]:     }
Nov 29 06:19:11 compute-2 sweet_wilson[73996]: ]
Nov 29 06:19:11 compute-2 systemd[1]: libpod-2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371.scope: Deactivated successfully.
Nov 29 06:19:11 compute-2 systemd[1]: libpod-2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371.scope: Consumed 1.148s CPU time.
Nov 29 06:19:11 compute-2 podman[73980]: 2025-11-29 06:19:11.568396682 +0000 UTC m=+1.315198663 container died 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Nov 29 06:19:11 compute-2 systemd[1]: var-lib-containers-storage-overlay-6f7dc80d8b1dac7ac36b089ee3254f9af9877e161a8d97dbd0fc1cf37c2fdb76-merged.mount: Deactivated successfully.
Nov 29 06:19:11 compute-2 podman[73980]: 2025-11-29 06:19:11.629961712 +0000 UTC m=+1.376763693 container remove 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef)
Nov 29 06:19:11 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:19:11 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:19:11 compute-2 systemd[1]: libpod-conmon-2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371.scope: Deactivated successfully.
Nov 29 06:19:11 compute-2 sudo[73831]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:11 compute-2 sudo[74955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:11 compute-2 sudo[74955]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:11 compute-2 sudo[74955]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:11 compute-2 sudo[74980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 29 06:19:11 compute-2 sudo[74980]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:11 compute-2 sudo[74980]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:11 compute-2 sudo[75005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:11 compute-2 sudo[75005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:11 compute-2 sudo[75005]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:11 compute-2 sudo[75030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph
Nov 29 06:19:11 compute-2 sudo[75030]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:11 compute-2 sudo[75030]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:11 compute-2 sudo[75055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:11 compute-2 sudo[75055]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:11 compute-2 sudo[75055]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:19:12 compute-2 sudo[75080]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75080]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:12 compute-2 sudo[75105]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75105]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:12 compute-2 sudo[75130]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75130]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:12 compute-2 sudo[75155]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75155]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:19:12 compute-2 sudo[75180]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75180]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:12 compute-2 sudo[75228]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75228]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:19:12 compute-2 sudo[75253]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75253]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:12 compute-2 sudo[75278]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75278]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:19:12 compute-2 sudo[75303]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75303]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:12 compute-2 sudo[75328]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75328]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 29 06:19:12 compute-2 sudo[75353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75353]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:12 compute-2 sudo[75378]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75378]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:19:12 compute-2 sudo[75403]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75403]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:12 compute-2 sudo[75428]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75428]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:12 compute-2 sudo[75453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:19:12 compute-2 sudo[75453]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:12 compute-2 sudo[75453]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:13 compute-2 sudo[75478]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75478]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:19:13 compute-2 sudo[75503]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75503]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:13 compute-2 sudo[75528]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75528]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:13 compute-2 sudo[75553]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75553]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:13 compute-2 sudo[75578]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:19:13 compute-2 sudo[75603]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75603]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:13 compute-2 sudo[75651]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75651]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:19:13 compute-2 sudo[75676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75676]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75701]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:13 compute-2 sudo[75701]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75701]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:19:13 compute-2 sudo[75726]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75726]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:13 compute-2 sudo[75751]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75751]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:19:13 compute-2 sudo[75776]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75776]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:13 compute-2 sudo[75801]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75801]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 29 06:19:13 compute-2 sudo[75826]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75826]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:13 compute-2 sudo[75851]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75851]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph
Nov 29 06:19:13 compute-2 sudo[75876]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75876]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:13 compute-2 sudo[75901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:13 compute-2 sudo[75901]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:13 compute-2 sudo[75901]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[75926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.client.admin.keyring.new
Nov 29 06:19:14 compute-2 sudo[75926]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[75926]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[75951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:14 compute-2 sudo[75951]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[75951]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[75976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:14 compute-2 sudo[75976]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[75976]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[76001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:14 compute-2 sudo[76001]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[76001]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[76026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.client.admin.keyring.new
Nov 29 06:19:14 compute-2 sudo[76026]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[76026]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[76074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:14 compute-2 sudo[76074]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[76074]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[76099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.client.admin.keyring.new
Nov 29 06:19:14 compute-2 sudo[76099]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[76099]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[76124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:14 compute-2 sudo[76124]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[76124]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[76149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.client.admin.keyring.new
Nov 29 06:19:14 compute-2 sudo[76149]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[76149]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[76174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:14 compute-2 sudo[76174]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[76174]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[76199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 29 06:19:14 compute-2 sudo[76199]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[76199]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[76224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:14 compute-2 sudo[76224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[76224]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:14 compute-2 sudo[76249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:19:14 compute-2 sudo[76249]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:14 compute-2 sudo[76249]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:15 compute-2 sudo[76274]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76274]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:19:15 compute-2 sudo[76299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76299]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:15 compute-2 sudo[76324]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76324]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring.new
Nov 29 06:19:15 compute-2 sudo[76349]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76349]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:15 compute-2 sudo[76374]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76374]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76399]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:15 compute-2 sudo[76399]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76399]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:15 compute-2 sudo[76424]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76424]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring.new
Nov 29 06:19:15 compute-2 sudo[76449]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76449]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:15 compute-2 sudo[76497]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76497]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring.new
Nov 29 06:19:15 compute-2 sudo[76522]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76522]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:15 compute-2 sudo[76547]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76547]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring.new
Nov 29 06:19:15 compute-2 sudo[76572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76572]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:15 compute-2 sudo[76597]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76597]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:15 compute-2 sudo[76622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring.new /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring
Nov 29 06:19:15 compute-2 sudo[76622]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:15 compute-2 sudo[76622]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:16 compute-2 sudo[76647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:16 compute-2 sudo[76647]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:16 compute-2 sudo[76647]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:16 compute-2 sudo[76672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:16 compute-2 sudo[76672]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:16 compute-2 sudo[76672]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:16 compute-2 sudo[76697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:16 compute-2 sudo[76697]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:16 compute-2 sudo[76697]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:16 compute-2 sudo[76722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:16 compute-2 sudo[76722]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:16 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:19:16 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:19:16 compute-2 podman[76788]: 2025-11-29 06:19:16.674019362 +0000 UTC m=+0.045522005 container create ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Nov 29 06:19:16 compute-2 systemd[1]: Started libpod-conmon-ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd.scope.
Nov 29 06:19:16 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:16 compute-2 podman[76788]: 2025-11-29 06:19:16.733274262 +0000 UTC m=+0.104776915 container init ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:16 compute-2 podman[76788]: 2025-11-29 06:19:16.741303311 +0000 UTC m=+0.112805974 container start ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 29 06:19:16 compute-2 podman[76788]: 2025-11-29 06:19:16.647233545 +0000 UTC m=+0.018736228 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:16 compute-2 podman[76788]: 2025-11-29 06:19:16.745577662 +0000 UTC m=+0.117080375 container attach ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:19:16 compute-2 intelligent_nightingale[76804]: 167 167
Nov 29 06:19:16 compute-2 systemd[1]: libpod-ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd.scope: Deactivated successfully.
Nov 29 06:19:16 compute-2 podman[76788]: 2025-11-29 06:19:16.748790136 +0000 UTC m=+0.120292809 container died ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:16 compute-2 podman[76788]: 2025-11-29 06:19:16.783861117 +0000 UTC m=+0.155363770 container remove ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:16 compute-2 systemd[1]: libpod-conmon-ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd.scope: Deactivated successfully.
Nov 29 06:19:16 compute-2 podman[76824]: 2025-11-29 06:19:16.847458251 +0000 UTC m=+0.037959908 container create 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Nov 29 06:19:16 compute-2 systemd[1]: Started libpod-conmon-4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50.scope.
Nov 29 06:19:16 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af18896dca3eb0eff10a9ef81f8d1123d450ac2a35305a029259dad8d4bee463/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af18896dca3eb0eff10a9ef81f8d1123d450ac2a35305a029259dad8d4bee463/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af18896dca3eb0eff10a9ef81f8d1123d450ac2a35305a029259dad8d4bee463/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af18896dca3eb0eff10a9ef81f8d1123d450ac2a35305a029259dad8d4bee463/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:16 compute-2 podman[76824]: 2025-11-29 06:19:16.903007035 +0000 UTC m=+0.093508712 container init 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:16 compute-2 podman[76824]: 2025-11-29 06:19:16.910714135 +0000 UTC m=+0.101215822 container start 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Nov 29 06:19:16 compute-2 podman[76824]: 2025-11-29 06:19:16.914300378 +0000 UTC m=+0.104802045 container attach 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:16 compute-2 podman[76824]: 2025-11-29 06:19:16.830007447 +0000 UTC m=+0.020509124 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:16 compute-2 systemd[1]: libpod-4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50.scope: Deactivated successfully.
Nov 29 06:19:17 compute-2 podman[76866]: 2025-11-29 06:19:17.034228446 +0000 UTC m=+0.021924041 container died 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:19:17 compute-2 podman[76866]: 2025-11-29 06:19:17.066011832 +0000 UTC m=+0.053707417 container remove 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:17 compute-2 systemd[1]: libpod-conmon-4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50.scope: Deactivated successfully.
Nov 29 06:19:17 compute-2 systemd[1]: Reloading.
Nov 29 06:19:17 compute-2 systemd-rc-local-generator[76911]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:17 compute-2 systemd-sysv-generator[76914]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:17 compute-2 systemd[1]: Reloading.
Nov 29 06:19:17 compute-2 systemd-rc-local-generator[76946]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:17 compute-2 systemd-sysv-generator[76950]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:17 compute-2 systemd[1]: Reached target All Ceph clusters and services.
Nov 29 06:19:17 compute-2 systemd[1]: Reloading.
Nov 29 06:19:17 compute-2 systemd-rc-local-generator[76984]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:17 compute-2 systemd-sysv-generator[76988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:17 compute-2 systemd[1]: Reached target Ceph cluster 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:19:17 compute-2 systemd[1]: Reloading.
Nov 29 06:19:17 compute-2 systemd-sysv-generator[77023]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:17 compute-2 systemd-rc-local-generator[77019]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:18 compute-2 systemd[1]: Reloading.
Nov 29 06:19:18 compute-2 systemd-rc-local-generator[77064]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:18 compute-2 systemd-sysv-generator[77068]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:18 compute-2 systemd[1]: Created slice Slice /system/ceph-336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:19:18 compute-2 systemd[1]: Reached target System Time Set.
Nov 29 06:19:18 compute-2 systemd[1]: Reached target System Time Synchronized.
Nov 29 06:19:18 compute-2 systemd[1]: Starting Ceph mon.compute-2 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:19:18 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:19:18 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:19:18 compute-2 podman[77122]: 2025-11-29 06:19:18.552103687 +0000 UTC m=+0.036385177 container create 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Nov 29 06:19:18 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d31412971db94b5a28cc98a3068e2335e13b119df5a46bfd577c8c751af35ed6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:18 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d31412971db94b5a28cc98a3068e2335e13b119df5a46bfd577c8c751af35ed6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:18 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d31412971db94b5a28cc98a3068e2335e13b119df5a46bfd577c8c751af35ed6/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:18 compute-2 podman[77122]: 2025-11-29 06:19:18.605289599 +0000 UTC m=+0.089571109 container init 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:18 compute-2 podman[77122]: 2025-11-29 06:19:18.60992894 +0000 UTC m=+0.094210430 container start 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:18 compute-2 bash[77122]: 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1
Nov 29 06:19:18 compute-2 podman[77122]: 2025-11-29 06:19:18.536577123 +0000 UTC m=+0.020858643 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:18 compute-2 systemd[1]: Started Ceph mon.compute-2 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:19:18 compute-2 ceph-mon[77142]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pidfile_write: ignore empty --pid-file
Nov 29 06:19:18 compute-2 sudo[76722]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:18 compute-2 ceph-mon[77142]: load: jerasure load: lrc 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: RocksDB version: 7.9.2
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Git sha 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: DB SUMMARY
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: DB Session ID:  VR5455MVOXQY2YZBKO9G
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: CURRENT file:  CURRENT
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                         Options.error_if_exists: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                       Options.create_if_missing: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                                     Options.env: 0x55be8793bc40
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                                Options.info_log: 0x55be896fafc0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                              Options.statistics: (nil)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                               Options.use_fsync: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                              Options.db_log_dir: 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                                 Options.wal_dir: 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                    Options.write_buffer_manager: 0x55be8970ab40
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                  Options.unordered_write: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                               Options.row_cache: None
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                              Options.wal_filter: None
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.two_write_queues: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.wal_compression: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.atomic_flush: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.max_background_jobs: 2
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.max_background_compactions: -1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.max_subcompactions: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.max_total_wal_size: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                          Options.max_open_files: -1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:       Options.compaction_readahead_size: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Compression algorithms supported:
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         kZSTD supported: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         kXpressCompression supported: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         kBZip2Compression supported: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         kLZ4Compression supported: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         kZlibCompression supported: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         kLZ4HCCompression supported: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         kSnappyCompression supported: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:           Options.merge_operator: 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55be896fac00)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55be896f31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:        Options.write_buffer_size: 33554432
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:  Options.max_write_buffer_number: 2
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:          Options.compression: NoCompression
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 70291dfa-fb4b-4030-8b2f-275b626805e0
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397158651407, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397158653206, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397158653305, "job": 1, "event": "recovery_finished"}
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55be8971ce00
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: DB pointer 0x55be897a6000
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:19:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55be896f31f0#2 capacity: 512.00 MB usage: 0.86 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,0.64 KB,0.00012219%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Nov 29 06:19:18 compute-2 ceph-mon[77142]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Nov 29 06:19:18 compute-2 ceph-mon[77142]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(???) e0 preinit fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).mds e1 new map
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).mds e1 print_map
                                           e1
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 1 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e29 crush map has features 3314933000852226048, adjusting msgr requires
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e13: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v66: 2 pgs: 1 unknown, 1 creating+peering; 0 B data, 853 MiB used, 13 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/577122409' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/577122409' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mgrmap e8: compute-0.vxabpq(active, since 2m)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e14: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v68: 3 pgs: 2 unknown, 1 creating+peering; 0 B data, 453 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e15: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1457732535' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1457732535' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e16: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v71: 4 pgs: 2 active+clean, 2 unknown; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e17: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2491487437' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2491487437' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e18: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v74: 36 pgs: 2 active+clean, 34 unknown; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e19: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.1 deep-scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.1 deep-scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2900095816' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2900095816' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e20: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v77: 37 pgs: 1 unknown, 1 creating+peering, 35 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.2 scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.2 scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/956031255' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v78: 37 pgs: 1 unknown, 1 creating+peering, 35 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.3 scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.3 scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/956031255' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e21: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.4 scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.4 scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e22: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2774593808' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v81: 38 pgs: 1 unknown, 37 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.5 scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.5 scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Health check update: 6 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2774593808' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e23: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.6 deep-scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.6 deep-scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3785446785' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v83: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.7 scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.7 scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3785446785' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e24: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e25: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v86: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3924631149' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3924631149' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e26: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.8 scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.8 scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.e scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.e scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Updating compute-2:/etc/ceph/ceph.conf
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v88: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.b scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.b scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/935132046' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/935132046' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e27: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v90: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1714792720' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1714792720' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e28: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v92: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v93: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: pgmap v94: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Deploying daemon mon.compute-2 on compute-2
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Health check cleared: CEPHADM_REFRESH_FAILED (was: failed to probe daemons or devices)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2338482810' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Nov 29 06:19:18 compute-2 ceph-mon[77142]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:19:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2338482810' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Nov 29 06:19:18 compute-2 ceph-mon[77142]: osdmap e29: 2 total, 2 up, 2 in
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.a scrub starts
Nov 29 06:19:18 compute-2 ceph-mon[77142]: 2.a scrub ok
Nov 29 06:19:18 compute-2 ceph-mon[77142]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Nov 29 06:19:20 compute-2 ceph-mon[77142]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Nov 29 06:19:20 compute-2 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 06:19:20 compute-2 ceph-mon[77142]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 29 06:19:20 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:19:21 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Nov 29 06:19:22 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Nov 29 06:19:23 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:19:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 29 06:19:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 29 06:19:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:19:23 compute-2 ceph-mon[77142]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2025-11-29T06:19:16.947562Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864320,os=Linux}
Nov 29 06:19:24 compute-2 ceph-mon[77142]: 2.f deep-scrub starts
Nov 29 06:19:24 compute-2 ceph-mon[77142]: 2.f deep-scrub ok
Nov 29 06:19:24 compute-2 ceph-mon[77142]: Deploying daemon mon.compute-1 on compute-1
Nov 29 06:19:24 compute-2 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 06:19:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Nov 29 06:19:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:24 compute-2 ceph-mon[77142]: pgmap v97: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:24 compute-2 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 06:19:24 compute-2 ceph-mon[77142]: 2.11 scrub starts
Nov 29 06:19:24 compute-2 ceph-mon[77142]: 2.11 scrub ok
Nov 29 06:19:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:24 compute-2 ceph-mon[77142]: pgmap v98: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:24 compute-2 ceph-mon[77142]: 2.12 scrub starts
Nov 29 06:19:24 compute-2 ceph-mon[77142]: 2.12 scrub ok
Nov 29 06:19:24 compute-2 ceph-mon[77142]: 2.c scrub starts
Nov 29 06:19:24 compute-2 ceph-mon[77142]: 2.c scrub ok
Nov 29 06:19:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:24 compute-2 ceph-mon[77142]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Nov 29 06:19:24 compute-2 ceph-mon[77142]: monmap e2: 2 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Nov 29 06:19:24 compute-2 ceph-mon[77142]: fsmap 
Nov 29 06:19:24 compute-2 ceph-mon[77142]: osdmap e29: 2 total, 2 up, 2 in
Nov 29 06:19:24 compute-2 ceph-mon[77142]: mgrmap e8: compute-0.vxabpq(active, since 2m)
Nov 29 06:19:24 compute-2 ceph-mon[77142]: overall HEALTH_OK
Nov 29 06:19:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:24 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Nov 29 06:19:24 compute-2 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 06:19:24 compute-2 ceph-mon[77142]: paxos.1).electionLogic(10) init, last seen epoch 10
Nov 29 06:19:24 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:19:24 compute-2 sudo[77181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:24 compute-2 sudo[77181]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:24 compute-2 sudo[77181]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:24 compute-2 sudo[77206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:24 compute-2 sudo[77206]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:24 compute-2 sudo[77206]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:24 compute-2 sudo[77231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:24 compute-2 sudo[77231]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:24 compute-2 sudo[77231]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:24 compute-2 sudo[77256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:24 compute-2 sudo[77256]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:24 compute-2 podman[77322]: 2025-11-29 06:19:24.658891393 +0000 UTC m=+0.036760066 container create cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:24 compute-2 systemd[1]: Started libpod-conmon-cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539.scope.
Nov 29 06:19:24 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:24 compute-2 podman[77322]: 2025-11-29 06:19:24.733429501 +0000 UTC m=+0.111298204 container init cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 29 06:19:24 compute-2 podman[77322]: 2025-11-29 06:19:24.642750854 +0000 UTC m=+0.020619537 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:24 compute-2 podman[77322]: 2025-11-29 06:19:24.741222414 +0000 UTC m=+0.119091087 container start cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:24 compute-2 podman[77322]: 2025-11-29 06:19:24.744338095 +0000 UTC m=+0.122206768 container attach cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:24 compute-2 relaxed_euler[77338]: 167 167
Nov 29 06:19:24 compute-2 systemd[1]: libpod-cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539.scope: Deactivated successfully.
Nov 29 06:19:24 compute-2 podman[77322]: 2025-11-29 06:19:24.747170268 +0000 UTC m=+0.125038951 container died cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Nov 29 06:19:24 compute-2 systemd[1]: var-lib-containers-storage-overlay-03093a32d0d5dfba5098240b29cd21f28a2982381d0308b4235c14c7d57fa0de-merged.mount: Deactivated successfully.
Nov 29 06:19:24 compute-2 podman[77322]: 2025-11-29 06:19:24.780068164 +0000 UTC m=+0.157936867 container remove cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Nov 29 06:19:24 compute-2 systemd[1]: libpod-conmon-cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539.scope: Deactivated successfully.
Nov 29 06:19:24 compute-2 systemd[1]: Reloading.
Nov 29 06:19:24 compute-2 systemd-rc-local-generator[77381]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:24 compute-2 systemd-sysv-generator[77384]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:25 compute-2 systemd[1]: Reloading.
Nov 29 06:19:25 compute-2 systemd-rc-local-generator[77425]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:25 compute-2 systemd-sysv-generator[77429]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:25 compute-2 systemd[1]: Starting Ceph mgr.compute-2.ngsyhe for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:19:25 compute-2 podman[77485]: 2025-11-29 06:19:25.483264035 +0000 UTC m=+0.019501458 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:25 compute-2 podman[77485]: 2025-11-29 06:19:25.980363908 +0000 UTC m=+0.516601301 container create 08bcce8f2a322c8ab979069b9ba321569afcdd4bcb6f299dc6807bd13b238413 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Nov 29 06:19:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d784d0f3a431ea69c785c946dd6a3c92bae52140bd72bec2679550664a59bd97/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d784d0f3a431ea69c785c946dd6a3c92bae52140bd72bec2679550664a59bd97/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d784d0f3a431ea69c785c946dd6a3c92bae52140bd72bec2679550664a59bd97/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d784d0f3a431ea69c785c946dd6a3c92bae52140bd72bec2679550664a59bd97/merged/var/lib/ceph/mgr/ceph-compute-2.ngsyhe supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:26 compute-2 podman[77485]: 2025-11-29 06:19:26.096892717 +0000 UTC m=+0.633130130 container init 08bcce8f2a322c8ab979069b9ba321569afcdd4bcb6f299dc6807bd13b238413 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:26 compute-2 podman[77485]: 2025-11-29 06:19:26.102288637 +0000 UTC m=+0.638526030 container start 08bcce8f2a322c8ab979069b9ba321569afcdd4bcb6f299dc6807bd13b238413 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:26 compute-2 bash[77485]: 08bcce8f2a322c8ab979069b9ba321569afcdd4bcb6f299dc6807bd13b238413
Nov 29 06:19:26 compute-2 systemd[1]: Started Ceph mgr.compute-2.ngsyhe for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:19:26 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:26 compute-2 sudo[77256]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:26 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:26 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:27 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:29 compute-2 ceph-mon[77142]: paxos.1).electionLogic(11) init, last seen epoch 11, mid-election, bumping
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:29 compute-2 ceph-mgr[77504]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:19:29 compute-2 ceph-mgr[77504]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Nov 29 06:19:29 compute-2 ceph-mgr[77504]: pidfile_write: ignore empty --pid-file
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:29 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'alerts'
Nov 29 06:19:29 compute-2 ceph-mon[77142]: pgmap v99: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:29 compute-2 ceph-mon[77142]: Deploying daemon mgr.compute-2.ngsyhe on compute-2
Nov 29 06:19:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 06:19:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 06:19:29 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/501439537' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Nov 29 06:19:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-2 ceph-mon[77142]: 2.14 scrub starts
Nov 29 06:19:29 compute-2 ceph-mon[77142]: 2.14 scrub ok
Nov 29 06:19:29 compute-2 ceph-mon[77142]: pgmap v100: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mon.compute-1 calling monitor election
Nov 29 06:19:29 compute-2 ceph-mon[77142]: 2.d scrub starts
Nov 29 06:19:29 compute-2 ceph-mon[77142]: 2.d scrub ok
Nov 29 06:19:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-2 ceph-mon[77142]: 2.16 scrub starts
Nov 29 06:19:29 compute-2 ceph-mon[77142]: 2.16 scrub ok
Nov 29 06:19:29 compute-2 ceph-mon[77142]: pgmap v101: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-2 ceph-mon[77142]: 2.17 scrub starts
Nov 29 06:19:29 compute-2 ceph-mon[77142]: 2.17 scrub ok
Nov 29 06:19:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 06:19:29 compute-2 ceph-mon[77142]: monmap e3: 3 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-1=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Nov 29 06:19:29 compute-2 ceph-mon[77142]: fsmap 
Nov 29 06:19:29 compute-2 ceph-mon[77142]: osdmap e29: 2 total, 2 up, 2 in
Nov 29 06:19:29 compute-2 ceph-mon[77142]: mgrmap e8: compute-0.vxabpq(active, since 2m)
Nov 29 06:19:29 compute-2 ceph-mon[77142]: overall HEALTH_OK
Nov 29 06:19:29 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/501439537' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Nov 29 06:19:29 compute-2 ceph-mgr[77504]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 29 06:19:29 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'balancer'
Nov 29 06:19:29 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:29.813+0000 7f62940bd140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 29 06:19:30 compute-2 ceph-mgr[77504]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 29 06:19:30 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'cephadm'
Nov 29 06:19:30 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:30.090+0000 7f62940bd140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 29 06:19:30 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:30 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:30 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:30 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:30 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gaxpay", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 29 06:19:30 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gaxpay", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 29 06:19:30 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:19:30 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:30 compute-2 ceph-mon[77142]: Deploying daemon mgr.compute-1.gaxpay on compute-1
Nov 29 06:19:30 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:30 compute-2 ceph-mon[77142]: 2.18 scrub starts
Nov 29 06:19:30 compute-2 ceph-mon[77142]: 2.18 scrub ok
Nov 29 06:19:31 compute-2 ceph-mon[77142]: pgmap v102: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:31 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2714267067' entity='client.admin' 
Nov 29 06:19:31 compute-2 ceph-mon[77142]: 2.1a deep-scrub starts
Nov 29 06:19:31 compute-2 ceph-mon[77142]: 2.1a deep-scrub ok
Nov 29 06:19:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:32 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'crash'
Nov 29 06:19:32 compute-2 ceph-mgr[77504]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 29 06:19:32 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'dashboard'
Nov 29 06:19:32 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:32.385+0000 7f62940bd140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 29 06:19:32 compute-2 sudo[77540]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:32 compute-2 sudo[77540]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:32 compute-2 sudo[77540]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:32 compute-2 sudo[77565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:32 compute-2 sudo[77565]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:32 compute-2 sudo[77565]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:32 compute-2 sudo[77590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:32 compute-2 sudo[77590]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:32 compute-2 sudo[77590]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:32 compute-2 ceph-mon[77142]: 2.1e scrub starts
Nov 29 06:19:32 compute-2 ceph-mon[77142]: 2.1e scrub ok
Nov 29 06:19:32 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-2 ceph-mon[77142]: pgmap v103: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:32 compute-2 ceph-mon[77142]: from='client.14256 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:19:32 compute-2 ceph-mon[77142]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Nov 29 06:19:32 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-2 ceph-mon[77142]: Saving service ingress.rgw.default spec with placement count:2
Nov 29 06:19:32 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 06:19:32 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Nov 29 06:19:32 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:32 compute-2 ceph-mon[77142]: Deploying daemon crash.compute-2 on compute-2
Nov 29 06:19:32 compute-2 sudo[77615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:32 compute-2 sudo[77615]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:33 compute-2 podman[77681]: 2025-11-29 06:19:33.185323283 +0000 UTC m=+0.045268927 container create e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:33 compute-2 systemd[1]: Started libpod-conmon-e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5.scope.
Nov 29 06:19:33 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:33 compute-2 podman[77681]: 2025-11-29 06:19:33.167924978 +0000 UTC m=+0.027870642 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:33 compute-2 podman[77681]: 2025-11-29 06:19:33.270554866 +0000 UTC m=+0.130500560 container init e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:19:33 compute-2 podman[77681]: 2025-11-29 06:19:33.277289903 +0000 UTC m=+0.137235567 container start e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Nov 29 06:19:33 compute-2 podman[77681]: 2025-11-29 06:19:33.281987426 +0000 UTC m=+0.141933070 container attach e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Nov 29 06:19:33 compute-2 nervous_yonath[77697]: 167 167
Nov 29 06:19:33 compute-2 systemd[1]: libpod-e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5.scope: Deactivated successfully.
Nov 29 06:19:33 compute-2 conmon[77697]: conmon e4128caa21807f2bbfcd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5.scope/container/memory.events
Nov 29 06:19:33 compute-2 podman[77681]: 2025-11-29 06:19:33.288480316 +0000 UTC m=+0.148425970 container died e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:19:33 compute-2 systemd[1]: var-lib-containers-storage-overlay-3eb33ebd42ed44668431591fb691b15dc01d80f896bb93ac2acd6f5b4cdc7f19-merged.mount: Deactivated successfully.
Nov 29 06:19:33 compute-2 podman[77681]: 2025-11-29 06:19:33.336018682 +0000 UTC m=+0.195964326 container remove e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:33 compute-2 systemd[1]: libpod-conmon-e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5.scope: Deactivated successfully.
Nov 29 06:19:33 compute-2 systemd[1]: Reloading.
Nov 29 06:19:33 compute-2 systemd-rc-local-generator[77741]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:33 compute-2 systemd-sysv-generator[77744]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:33 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e29 _set_new_cache_sizes cache_size:1019926139 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:33 compute-2 systemd[1]: Reloading.
Nov 29 06:19:33 compute-2 systemd-rc-local-generator[77783]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:33 compute-2 ceph-mon[77142]: 2.1f scrub starts
Nov 29 06:19:33 compute-2 ceph-mon[77142]: 2.1f scrub ok
Nov 29 06:19:33 compute-2 systemd-sysv-generator[77788]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:33 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'devicehealth'
Nov 29 06:19:33 compute-2 systemd[1]: Starting Ceph crash.compute-2 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:19:34 compute-2 podman[77843]: 2025-11-29 06:19:34.118001388 +0000 UTC m=+0.032777780 container create 0ad5ea54c4d3a884204483ac831e854807deecb353611aa286eddde0eac40b49 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:34.142+0000 7f62940bd140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 29 06:19:34 compute-2 ceph-mgr[77504]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 29 06:19:34 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'diskprediction_local'
Nov 29 06:19:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8fd6eb729b8ca574b1224f3cec7bbcb055f3a129aec949dc83fcb6c3ad1a8ab/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8fd6eb729b8ca574b1224f3cec7bbcb055f3a129aec949dc83fcb6c3ad1a8ab/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8fd6eb729b8ca574b1224f3cec7bbcb055f3a129aec949dc83fcb6c3ad1a8ab/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8fd6eb729b8ca574b1224f3cec7bbcb055f3a129aec949dc83fcb6c3ad1a8ab/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:34 compute-2 podman[77843]: 2025-11-29 06:19:34.163370237 +0000 UTC m=+0.078146629 container init 0ad5ea54c4d3a884204483ac831e854807deecb353611aa286eddde0eac40b49 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Nov 29 06:19:34 compute-2 podman[77843]: 2025-11-29 06:19:34.170191765 +0000 UTC m=+0.084968157 container start 0ad5ea54c4d3a884204483ac831e854807deecb353611aa286eddde0eac40b49 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3)
Nov 29 06:19:34 compute-2 bash[77843]: 0ad5ea54c4d3a884204483ac831e854807deecb353611aa286eddde0eac40b49
Nov 29 06:19:34 compute-2 podman[77843]: 2025-11-29 06:19:34.104027672 +0000 UTC m=+0.018804084 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:34 compute-2 systemd[1]: Started Ceph crash.compute-2 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:19:34 compute-2 sudo[77615]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: INFO:ceph-crash:pinging cluster to exercise our key
Nov 29 06:19:34 compute-2 sudo[77864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:34 compute-2 sudo[77864]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:34 compute-2 sudo[77864]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:34 compute-2 sudo[77891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:34 compute-2 sudo[77891]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:34 compute-2 sudo[77891]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:34 compute-2 sudo[77916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:34 compute-2 sudo[77916]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:34 compute-2 sudo[77916]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.548+0000 7f718b01f640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.548+0000 7f718b01f640 -1 AuthRegistry(0x7f71840675b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.549+0000 7f718b01f640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.549+0000 7f718b01f640 -1 AuthRegistry(0x7f718b01e000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.550+0000 7f7189595640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.552+0000 7f7188d94640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.553+0000 7f7183fff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.554+0000 7f718b01f640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: [errno 13] RADOS permission denied (error connecting to the cluster)
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Nov 29 06:19:34 compute-2 sudo[77941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Nov 29 06:19:34 compute-2 sudo[77941]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]:   from numpy import show_config as show_numpy_config
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:34.690+0000 7f62940bd140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 29 06:19:34 compute-2 ceph-mgr[77504]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 29 06:19:34 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'influx'
Nov 29 06:19:34 compute-2 podman[78013]: 2025-11-29 06:19:34.866296253 +0000 UTC m=+0.046958512 container create 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:34 compute-2 systemd[1]: Started libpod-conmon-30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca.scope.
Nov 29 06:19:34 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:34 compute-2 ceph-mgr[77504]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 29 06:19:34 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:34.935+0000 7f62940bd140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 29 06:19:34 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'insights'
Nov 29 06:19:34 compute-2 podman[78013]: 2025-11-29 06:19:34.839911281 +0000 UTC m=+0.020573560 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:34 compute-2 podman[78013]: 2025-11-29 06:19:34.948872326 +0000 UTC m=+0.129534605 container init 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:34 compute-2 podman[78013]: 2025-11-29 06:19:34.955576002 +0000 UTC m=+0.136238271 container start 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Nov 29 06:19:34 compute-2 podman[78013]: 2025-11-29 06:19:34.95931339 +0000 UTC m=+0.139975679 container attach 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Nov 29 06:19:34 compute-2 nifty_cohen[78030]: 167 167
Nov 29 06:19:34 compute-2 systemd[1]: libpod-30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca.scope: Deactivated successfully.
Nov 29 06:19:34 compute-2 podman[78013]: 2025-11-29 06:19:34.961821815 +0000 UTC m=+0.142484074 container died 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Nov 29 06:19:34 compute-2 systemd[1]: var-lib-containers-storage-overlay-728cd6d1d812d51298ea65e05e41acbdd2baecb9eb9c560c79580fd2e99f8bd2-merged.mount: Deactivated successfully.
Nov 29 06:19:34 compute-2 podman[78013]: 2025-11-29 06:19:34.996434872 +0000 UTC m=+0.177097131 container remove 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 06:19:35 compute-2 systemd[1]: libpod-conmon-30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca.scope: Deactivated successfully.
Nov 29 06:19:35 compute-2 podman[78055]: 2025-11-29 06:19:35.150674503 +0000 UTC m=+0.044664501 container create 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Nov 29 06:19:35 compute-2 systemd[1]: Started libpod-conmon-9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0.scope.
Nov 29 06:19:35 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'iostat'
Nov 29 06:19:35 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:35 compute-2 podman[78055]: 2025-11-29 06:19:35.133252257 +0000 UTC m=+0.027242255 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:35 compute-2 podman[78055]: 2025-11-29 06:19:35.235399773 +0000 UTC m=+0.129389781 container init 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:35 compute-2 podman[78055]: 2025-11-29 06:19:35.241139233 +0000 UTC m=+0.135129231 container start 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:19:35 compute-2 podman[78055]: 2025-11-29 06:19:35.244979004 +0000 UTC m=+0.138969052 container attach 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Nov 29 06:19:35 compute-2 ceph-mon[77142]: pgmap v104: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:35 compute-2 ceph-mon[77142]: 2.10 scrub starts
Nov 29 06:19:35 compute-2 ceph-mon[77142]: 2.10 scrub ok
Nov 29 06:19:35 compute-2 ceph-mgr[77504]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 29 06:19:35 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:35.436+0000 7f62940bd140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 29 06:19:35 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'k8sevents'
Nov 29 06:19:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e2 new map
Nov 29 06:19:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e2 print_map
                                           e2
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:19:35.589013+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                            
                                            
Nov 29 06:19:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e30 e30: 2 total, 2 up, 2 in
Nov 29 06:19:36 compute-2 brave_noether[78071]: --> passed data devices: 0 physical, 1 LVM
Nov 29 06:19:36 compute-2 brave_noether[78071]: --> relative data size: 1.0
Nov 29 06:19:36 compute-2 brave_noether[78071]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 29 06:19:36 compute-2 brave_noether[78071]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new f86a06f9-a09f-46de-8440-929a842d2c66
Nov 29 06:19:36 compute-2 ceph-mon[77142]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "compute-0 compute-1 compute-2 ", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:19:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Nov 29 06:19:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Nov 29 06:19:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Nov 29 06:19:36 compute-2 ceph-mon[77142]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Nov 29 06:19:36 compute-2 ceph-mon[77142]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Nov 29 06:19:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Nov 29 06:19:36 compute-2 ceph-mon[77142]: osdmap e30: 2 total, 2 up, 2 in
Nov 29 06:19:36 compute-2 ceph-mon[77142]: fsmap cephfs:0
Nov 29 06:19:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"} v 0) v1
Nov 29 06:19:36 compute-2 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2624547066' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]: dispatch
Nov 29 06:19:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e31 e31: 3 total, 2 up, 3 in
Nov 29 06:19:36 compute-2 brave_noether[78071]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 29 06:19:36 compute-2 brave_noether[78071]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Nov 29 06:19:36 compute-2 lvm[78119]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:19:36 compute-2 lvm[78119]: VG ceph_vg0 finished
Nov 29 06:19:36 compute-2 brave_noether[78071]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Nov 29 06:19:36 compute-2 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 06:19:36 compute-2 brave_noether[78071]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:36 compute-2 brave_noether[78071]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Nov 29 06:19:36 compute-2 systemd[72593]: Starting Mark boot as successful...
Nov 29 06:19:36 compute-2 systemd[72593]: Finished Mark boot as successful.
Nov 29 06:19:37 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Nov 29 06:19:37 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2894938433' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Nov 29 06:19:37 compute-2 brave_noether[78071]:  stderr: got monmap epoch 3
Nov 29 06:19:37 compute-2 brave_noether[78071]: --> Creating keyring file for osd.2
Nov 29 06:19:37 compute-2 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Nov 29 06:19:37 compute-2 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Nov 29 06:19:37 compute-2 brave_noether[78071]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid f86a06f9-a09f-46de-8440-929a842d2c66 --setuser ceph --setgroup ceph
Nov 29 06:19:37 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'localpool'
Nov 29 06:19:37 compute-2 ceph-mon[77142]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Nov 29 06:19:37 compute-2 ceph-mon[77142]: pgmap v106: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:37 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2624547066' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]: dispatch
Nov 29 06:19:37 compute-2 ceph-mon[77142]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]: dispatch
Nov 29 06:19:37 compute-2 ceph-mon[77142]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]': finished
Nov 29 06:19:37 compute-2 ceph-mon[77142]: osdmap e31: 3 total, 2 up, 3 in
Nov 29 06:19:37 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:37 compute-2 ceph-mon[77142]: 2.1c scrub starts
Nov 29 06:19:37 compute-2 ceph-mon[77142]: 2.1c scrub ok
Nov 29 06:19:37 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:37 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2894938433' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Nov 29 06:19:37 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'mds_autoscaler'
Nov 29 06:19:38 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'mirroring'
Nov 29 06:19:38 compute-2 ceph-mon[77142]: from='client.14268 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:19:38 compute-2 ceph-mon[77142]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Nov 29 06:19:38 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'nfs'
Nov 29 06:19:38 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020053029 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:39 compute-2 ceph-mgr[77504]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 29 06:19:39 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'orchestrator'
Nov 29 06:19:39 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:39.409+0000 7f62940bd140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 29 06:19:39 compute-2 brave_noether[78071]:  stderr: 2025-11-29T06:19:37.226+0000 7f1e2dea4740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 06:19:39 compute-2 brave_noether[78071]:  stderr: 2025-11-29T06:19:37.226+0000 7f1e2dea4740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 06:19:39 compute-2 brave_noether[78071]:  stderr: 2025-11-29T06:19:37.226+0000 7f1e2dea4740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 06:19:39 compute-2 brave_noether[78071]:  stderr: 2025-11-29T06:19:37.226+0000 7f1e2dea4740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Nov 29 06:19:39 compute-2 brave_noether[78071]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Nov 29 06:19:39 compute-2 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 06:19:39 compute-2 brave_noether[78071]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Nov 29 06:19:39 compute-2 brave_noether[78071]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:39 compute-2 brave_noether[78071]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:39 compute-2 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 06:19:39 compute-2 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 06:19:39 compute-2 brave_noether[78071]: --> ceph-volume lvm activate successful for osd ID: 2
Nov 29 06:19:39 compute-2 brave_noether[78071]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Nov 29 06:19:39 compute-2 systemd[1]: libpod-9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0.scope: Deactivated successfully.
Nov 29 06:19:39 compute-2 systemd[1]: libpod-9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0.scope: Consumed 2.524s CPU time.
Nov 29 06:19:39 compute-2 podman[78055]: 2025-11-29 06:19:39.609344326 +0000 UTC m=+4.503334324 container died 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Nov 29 06:19:40 compute-2 ceph-mon[77142]: pgmap v108: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:40 compute-2 ceph-mgr[77504]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 29 06:19:40 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'osd_perf_query'
Nov 29 06:19:40 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:40.113+0000 7f62940bd140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 29 06:19:40 compute-2 ceph-mgr[77504]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 29 06:19:40 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'osd_support'
Nov 29 06:19:40 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:40.392+0000 7f62940bd140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 29 06:19:40 compute-2 systemd[1]: var-lib-containers-storage-overlay-b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe-merged.mount: Deactivated successfully.
Nov 29 06:19:40 compute-2 podman[78055]: 2025-11-29 06:19:40.638647182 +0000 UTC m=+5.532637180 container remove 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:40 compute-2 ceph-mgr[77504]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 29 06:19:40 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'pg_autoscaler'
Nov 29 06:19:40 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:40.652+0000 7f62940bd140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 29 06:19:40 compute-2 sudo[77941]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:40 compute-2 systemd[1]: libpod-conmon-9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0.scope: Deactivated successfully.
Nov 29 06:19:40 compute-2 sudo[79036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:40 compute-2 sudo[79036]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:40 compute-2 sudo[79036]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:40 compute-2 sudo[79061]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:40 compute-2 sudo[79061]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:40 compute-2 sudo[79061]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:40 compute-2 ceph-mon[77142]: 2.9 scrub starts
Nov 29 06:19:40 compute-2 ceph-mon[77142]: 2.9 scrub ok
Nov 29 06:19:40 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/713391435' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Nov 29 06:19:40 compute-2 ceph-mon[77142]: pgmap v109: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:40 compute-2 ceph-mon[77142]: 2.1d scrub starts
Nov 29 06:19:40 compute-2 ceph-mon[77142]: 2.1d scrub ok
Nov 29 06:19:40 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/713391435' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Nov 29 06:19:40 compute-2 sudo[79086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:40 compute-2 sudo[79086]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:40 compute-2 sudo[79086]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:40 compute-2 sudo[79111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- lvm list --format json
Nov 29 06:19:40 compute-2 sudo[79111]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:40 compute-2 ceph-mgr[77504]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 29 06:19:40 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:40.962+0000 7f62940bd140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 29 06:19:40 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'progress'
Nov 29 06:19:41 compute-2 podman[79176]: 2025-11-29 06:19:41.224181662 +0000 UTC m=+0.039725832 container create 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Nov 29 06:19:41 compute-2 ceph-mgr[77504]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 29 06:19:41 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'prometheus'
Nov 29 06:19:41 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:41.229+0000 7f62940bd140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 29 06:19:41 compute-2 systemd[1]: Started libpod-conmon-904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb.scope.
Nov 29 06:19:41 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:41 compute-2 podman[79176]: 2025-11-29 06:19:41.293617531 +0000 UTC m=+0.109161721 container init 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Nov 29 06:19:41 compute-2 podman[79176]: 2025-11-29 06:19:41.300226054 +0000 UTC m=+0.115770224 container start 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:19:41 compute-2 podman[79176]: 2025-11-29 06:19:41.20655071 +0000 UTC m=+0.022094880 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:41 compute-2 podman[79176]: 2025-11-29 06:19:41.30503686 +0000 UTC m=+0.120581030 container attach 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Nov 29 06:19:41 compute-2 gracious_goldberg[79192]: 167 167
Nov 29 06:19:41 compute-2 systemd[1]: libpod-904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb.scope: Deactivated successfully.
Nov 29 06:19:41 compute-2 podman[79176]: 2025-11-29 06:19:41.307258138 +0000 UTC m=+0.122802308 container died 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:41 compute-2 systemd[1]: var-lib-containers-storage-overlay-90eca51c6f16e2253e2894b7c47d6e8603267d9c96cd5d31e117e9877448823b-merged.mount: Deactivated successfully.
Nov 29 06:19:41 compute-2 podman[79176]: 2025-11-29 06:19:41.344046062 +0000 UTC m=+0.159590242 container remove 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:41 compute-2 systemd[1]: libpod-conmon-904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb.scope: Deactivated successfully.
Nov 29 06:19:41 compute-2 podman[79218]: 2025-11-29 06:19:41.494510794 +0000 UTC m=+0.040714187 container create 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 29 06:19:41 compute-2 systemd[1]: Started libpod-conmon-5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376.scope.
Nov 29 06:19:41 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:41 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bab3e35b38564394a11ecbdcd18292fa5bceef51a71a9aa6c248d23bbee2a9e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:41 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bab3e35b38564394a11ecbdcd18292fa5bceef51a71a9aa6c248d23bbee2a9e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:41 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bab3e35b38564394a11ecbdcd18292fa5bceef51a71a9aa6c248d23bbee2a9e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:41 compute-2 podman[79218]: 2025-11-29 06:19:41.47715413 +0000 UTC m=+0.023357533 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:41 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bab3e35b38564394a11ecbdcd18292fa5bceef51a71a9aa6c248d23bbee2a9e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:41 compute-2 podman[79218]: 2025-11-29 06:19:41.582613853 +0000 UTC m=+0.128817256 container init 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Nov 29 06:19:41 compute-2 podman[79218]: 2025-11-29 06:19:41.590794177 +0000 UTC m=+0.136997580 container start 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 06:19:41 compute-2 podman[79218]: 2025-11-29 06:19:41.595178982 +0000 UTC m=+0.141382385 container attach 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Nov 29 06:19:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:42 compute-2 ceph-mgr[77504]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 29 06:19:42 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'rbd_support'
Nov 29 06:19:42 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:42.340+0000 7f62940bd140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 29 06:19:42 compute-2 boring_driscoll[79234]: {
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:     "2": [
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:         {
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "devices": [
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "/dev/loop3"
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             ],
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "lv_name": "ceph_lv0",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "lv_size": "7511998464",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=QZmYMa-OSGs-30so-3STC-BZF6-ZfIW-V0Wtxa,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=336ec58c-893b-528f-a0c1-6ed1196bc047,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=f86a06f9-a09f-46de-8440-929a842d2c66,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "lv_uuid": "QZmYMa-OSGs-30so-3STC-BZF6-ZfIW-V0Wtxa",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "name": "ceph_lv0",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "tags": {
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.block_uuid": "QZmYMa-OSGs-30so-3STC-BZF6-ZfIW-V0Wtxa",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.cephx_lockbox_secret": "",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.cluster_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.cluster_name": "ceph",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.crush_device_class": "",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.encrypted": "0",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.osd_fsid": "f86a06f9-a09f-46de-8440-929a842d2c66",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.osd_id": "2",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.osdspec_affinity": "default_drive_group",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.type": "block",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:                 "ceph.vdo": "0"
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             },
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "type": "block",
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:             "vg_name": "ceph_vg0"
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:         }
Nov 29 06:19:42 compute-2 boring_driscoll[79234]:     ]
Nov 29 06:19:42 compute-2 boring_driscoll[79234]: }
Nov 29 06:19:42 compute-2 systemd[1]: libpod-5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376.scope: Deactivated successfully.
Nov 29 06:19:42 compute-2 podman[79218]: 2025-11-29 06:19:42.417103135 +0000 UTC m=+0.963306518 container died 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:42 compute-2 systemd[1]: var-lib-containers-storage-overlay-6bab3e35b38564394a11ecbdcd18292fa5bceef51a71a9aa6c248d23bbee2a9e-merged.mount: Deactivated successfully.
Nov 29 06:19:42 compute-2 podman[79218]: 2025-11-29 06:19:42.483535936 +0000 UTC m=+1.029739349 container remove 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Nov 29 06:19:42 compute-2 systemd[1]: libpod-conmon-5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376.scope: Deactivated successfully.
Nov 29 06:19:42 compute-2 sudo[79111]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:42 compute-2 sudo[79257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:42 compute-2 sudo[79257]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:42 compute-2 sudo[79257]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:42 compute-2 sudo[79282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:42 compute-2 sudo[79282]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:42 compute-2 sudo[79282]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:42 compute-2 ceph-mgr[77504]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 29 06:19:42 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:42.665+0000 7f62940bd140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 29 06:19:42 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'restful'
Nov 29 06:19:42 compute-2 sudo[79307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:42 compute-2 sudo[79307]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:42 compute-2 sudo[79307]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:42 compute-2 sudo[79332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:42 compute-2 sudo[79332]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:43 compute-2 podman[79397]: 2025-11-29 06:19:43.088757892 +0000 UTC m=+0.034731591 container create c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:43 compute-2 systemd[1]: Started libpod-conmon-c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af.scope.
Nov 29 06:19:43 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:43 compute-2 podman[79397]: 2025-11-29 06:19:43.161450757 +0000 UTC m=+0.107424486 container init c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Nov 29 06:19:43 compute-2 podman[79397]: 2025-11-29 06:19:43.167896616 +0000 UTC m=+0.113870315 container start c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:43 compute-2 peaceful_brattain[79413]: 167 167
Nov 29 06:19:43 compute-2 podman[79397]: 2025-11-29 06:19:43.07379885 +0000 UTC m=+0.019772569 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:43 compute-2 systemd[1]: libpod-c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af.scope: Deactivated successfully.
Nov 29 06:19:43 compute-2 podman[79397]: 2025-11-29 06:19:43.172685811 +0000 UTC m=+0.118659530 container attach c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:43 compute-2 podman[79397]: 2025-11-29 06:19:43.173098022 +0000 UTC m=+0.119071711 container died c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Nov 29 06:19:43 compute-2 systemd[1]: var-lib-containers-storage-overlay-c7c928a5dec5061b4b572ce892ddd032b20eb698483498c0759ee8680949db63-merged.mount: Deactivated successfully.
Nov 29 06:19:43 compute-2 podman[79397]: 2025-11-29 06:19:43.207933514 +0000 UTC m=+0.153907213 container remove c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Nov 29 06:19:43 compute-2 systemd[1]: libpod-conmon-c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af.scope: Deactivated successfully.
Nov 29 06:19:43 compute-2 ceph-mon[77142]: pgmap v110: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:43 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1241390295' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Nov 29 06:19:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Nov 29 06:19:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:43 compute-2 ceph-mon[77142]: Deploying daemon osd.2 on compute-2
Nov 29 06:19:43 compute-2 ceph-mon[77142]: 2.1b scrub starts
Nov 29 06:19:43 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'rgw'
Nov 29 06:19:43 compute-2 podman[79445]: 2025-11-29 06:19:43.479088019 +0000 UTC m=+0.041846088 container create 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:43 compute-2 systemd[1]: Started libpod-conmon-7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678.scope.
Nov 29 06:19:43 compute-2 podman[79445]: 2025-11-29 06:19:43.46083122 +0000 UTC m=+0.023589299 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:43 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:43 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054709 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:43 compute-2 podman[79445]: 2025-11-29 06:19:43.68373358 +0000 UTC m=+0.246491659 container init 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Nov 29 06:19:43 compute-2 podman[79445]: 2025-11-29 06:19:43.690376384 +0000 UTC m=+0.253134443 container start 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:43 compute-2 podman[79445]: 2025-11-29 06:19:43.69784982 +0000 UTC m=+0.260607879 container attach 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:44 compute-2 ceph-mgr[77504]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 29 06:19:44 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'rook'
Nov 29 06:19:44 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:44.130+0000 7f62940bd140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 29 06:19:44 compute-2 ceph-mon[77142]: 2.1b scrub ok
Nov 29 06:19:44 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/264614796' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 29 06:19:44 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test[79461]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 29 06:19:44 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test[79461]:                             [--no-systemd] [--no-tmpfs]
Nov 29 06:19:44 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test[79461]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 29 06:19:44 compute-2 systemd[1]: libpod-7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678.scope: Deactivated successfully.
Nov 29 06:19:44 compute-2 podman[79445]: 2025-11-29 06:19:44.428158323 +0000 UTC m=+0.990916392 container died 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 06:19:44 compute-2 systemd[1]: var-lib-containers-storage-overlay-33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330-merged.mount: Deactivated successfully.
Nov 29 06:19:44 compute-2 podman[79445]: 2025-11-29 06:19:44.490566948 +0000 UTC m=+1.053325007 container remove 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2)
Nov 29 06:19:44 compute-2 systemd[1]: libpod-conmon-7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678.scope: Deactivated successfully.
Nov 29 06:19:44 compute-2 systemd[1]: Reloading.
Nov 29 06:19:44 compute-2 systemd-rc-local-generator[79517]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:44 compute-2 systemd-sysv-generator[79525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:45 compute-2 systemd[1]: Reloading.
Nov 29 06:19:45 compute-2 systemd-sysv-generator[79565]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:45 compute-2 systemd-rc-local-generator[79559]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:45 compute-2 ceph-mon[77142]: pgmap v111: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:45 compute-2 systemd[1]: Starting Ceph osd.2 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:19:45 compute-2 podman[79626]: 2025-11-29 06:19:45.67467379 +0000 UTC m=+0.041728684 container create 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Nov 29 06:19:45 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:45 compute-2 podman[79626]: 2025-11-29 06:19:45.734700033 +0000 UTC m=+0.101754937 container init 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:45 compute-2 podman[79626]: 2025-11-29 06:19:45.744186551 +0000 UTC m=+0.111241445 container start 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Nov 29 06:19:45 compute-2 podman[79626]: 2025-11-29 06:19:45.74756483 +0000 UTC m=+0.114619724 container attach 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Nov 29 06:19:45 compute-2 podman[79626]: 2025-11-29 06:19:45.656088423 +0000 UTC m=+0.023143337 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:46 compute-2 ceph-mgr[77504]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 29 06:19:46 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:46.372+0000 7f62940bd140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 29 06:19:46 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'selftest'
Nov 29 06:19:46 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 06:19:46 compute-2 bash[79626]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 06:19:46 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 06:19:46 compute-2 bash[79626]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 06:19:46 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 06:19:46 compute-2 bash[79626]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 06:19:46 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 06:19:46 compute-2 bash[79626]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 06:19:46 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:46 compute-2 bash[79626]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:46 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 06:19:46 compute-2 bash[79626]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 06:19:46 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:46.651+0000 7f62940bd140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 29 06:19:46 compute-2 ceph-mgr[77504]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 29 06:19:46 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'snap_schedule'
Nov 29 06:19:46 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: --> ceph-volume raw activate successful for osd ID: 2
Nov 29 06:19:46 compute-2 bash[79626]: --> ceph-volume raw activate successful for osd ID: 2
Nov 29 06:19:46 compute-2 systemd[1]: libpod-90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a.scope: Deactivated successfully.
Nov 29 06:19:46 compute-2 podman[79626]: 2025-11-29 06:19:46.691671285 +0000 UTC m=+1.058726169 container died 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:46 compute-2 systemd[1]: var-lib-containers-storage-overlay-587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68-merged.mount: Deactivated successfully.
Nov 29 06:19:46 compute-2 podman[79626]: 2025-11-29 06:19:46.751516632 +0000 UTC m=+1.118571526 container remove 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:19:46 compute-2 ceph-mgr[77504]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 29 06:19:46 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:46.913+0000 7f62940bd140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 29 06:19:46 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'stats'
Nov 29 06:19:46 compute-2 podman[79803]: 2025-11-29 06:19:46.940907214 +0000 UTC m=+0.041696553 container create 30804851543e8389a8073c09df360b14bc4f5c48fe90d3035f6911fdf735c892 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edc42acad6bcc34eb236157944739c48cd471a327fc883f8ecca021512e5dce/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edc42acad6bcc34eb236157944739c48cd471a327fc883f8ecca021512e5dce/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edc42acad6bcc34eb236157944739c48cd471a327fc883f8ecca021512e5dce/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:47 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edc42acad6bcc34eb236157944739c48cd471a327fc883f8ecca021512e5dce/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:47 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edc42acad6bcc34eb236157944739c48cd471a327fc883f8ecca021512e5dce/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:47 compute-2 podman[79803]: 2025-11-29 06:19:47.013566308 +0000 UTC m=+0.114355677 container init 30804851543e8389a8073c09df360b14bc4f5c48fe90d3035f6911fdf735c892 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:19:47 compute-2 podman[79803]: 2025-11-29 06:19:47.020826038 +0000 UTC m=+0.121615387 container start 30804851543e8389a8073c09df360b14bc4f5c48fe90d3035f6911fdf735c892 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:47 compute-2 podman[79803]: 2025-11-29 06:19:46.924658559 +0000 UTC m=+0.025447908 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:47 compute-2 bash[79803]: 30804851543e8389a8073c09df360b14bc4f5c48fe90d3035f6911fdf735c892
Nov 29 06:19:47 compute-2 systemd[1]: Started Ceph osd.2 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:19:47 compute-2 ceph-osd[79822]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:19:47 compute-2 ceph-osd[79822]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Nov 29 06:19:47 compute-2 ceph-osd[79822]: pidfile_write: ignore empty --pid-file
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fecfc03c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fecfc03c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fecfc03c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fecfc03c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a0f000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a0f000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a0f000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a0f000 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a0f000 /var/lib/ceph/osd/ceph-2/block) close
Nov 29 06:19:47 compute-2 sudo[79332]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:47 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'status'
Nov 29 06:19:47 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2969688060' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Nov 29 06:19:47 compute-2 sudo[79835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:47 compute-2 sudo[79835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:47 compute-2 sudo[79835]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fecfc03c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 29 06:19:47 compute-2 sudo[79860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:47 compute-2 sudo[79860]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:47 compute-2 sudo[79860]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:47 compute-2 sudo[79887]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:47 compute-2 sudo[79887]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:47 compute-2 sudo[79887]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:47 compute-2 ceph-mgr[77504]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 29 06:19:47 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'telegraf'
Nov 29 06:19:47 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:47.453+0000 7f62940bd140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 29 06:19:47 compute-2 sudo[79912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- raw list --format json
Nov 29 06:19:47 compute-2 sudo[79912]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:47 compute-2 ceph-osd[79822]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Nov 29 06:19:47 compute-2 ceph-osd[79822]: load: jerasure load: lrc 
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 29 06:19:47 compute-2 ceph-mgr[77504]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 29 06:19:47 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:47.700+0000 7f62940bd140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 29 06:19:47 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'telemetry'
Nov 29 06:19:47 compute-2 podman[79982]: 2025-11-29 06:19:47.86441752 +0000 UTC m=+0.068835155 container create d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2)
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 06:19:47 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 29 06:19:47 compute-2 systemd[1]: Started libpod-conmon-d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174.scope.
Nov 29 06:19:47 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:47 compute-2 podman[79982]: 2025-11-29 06:19:47.847862076 +0000 UTC m=+0.052279731 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:47 compute-2 podman[79982]: 2025-11-29 06:19:47.955877786 +0000 UTC m=+0.160295451 container init d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef)
Nov 29 06:19:47 compute-2 podman[79982]: 2025-11-29 06:19:47.972583253 +0000 UTC m=+0.177000888 container start d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:47 compute-2 podman[79982]: 2025-11-29 06:19:47.976350492 +0000 UTC m=+0.180768137 container attach d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Nov 29 06:19:47 compute-2 systemd[1]: libpod-d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174.scope: Deactivated successfully.
Nov 29 06:19:47 compute-2 musing_williamson[80002]: 167 167
Nov 29 06:19:47 compute-2 conmon[80002]: conmon d371ac0ff2da37e5369a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174.scope/container/memory.events
Nov 29 06:19:47 compute-2 podman[79982]: 2025-11-29 06:19:47.980012388 +0000 UTC m=+0.184430043 container died d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:48 compute-2 systemd[1]: var-lib-containers-storage-overlay-790d7b2f8c6a7e0d4a25b7ac07631a5fdb815c616996821c22007e17ff089846-merged.mount: Deactivated successfully.
Nov 29 06:19:48 compute-2 podman[79982]: 2025-11-29 06:19:48.043560033 +0000 UTC m=+0.247977688 container remove d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Nov 29 06:19:48 compute-2 systemd[1]: libpod-conmon-d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174.scope: Deactivated successfully.
Nov 29 06:19:48 compute-2 ceph-osd[79822]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 29 06:19:48 compute-2 ceph-osd[79822]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluefs mount
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluefs mount shared_bdev_used = 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: RocksDB version: 7.9.2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Git sha 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: DB SUMMARY
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: DB Session ID:  IRL5VW3ZF53NYTB339J7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: CURRENT file:  CURRENT
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                         Options.error_if_exists: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.create_if_missing: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                                     Options.env: 0x55fed0a93f10
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                                Options.info_log: 0x55fecfc80c80
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                              Options.statistics: (nil)
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.use_fsync: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                              Options.db_log_dir: 
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                                 Options.wal_dir: db.wal
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.write_buffer_manager: 0x55fed0ba8460
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.unordered_write: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.row_cache: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                              Options.wal_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.two_write_queues: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.wal_compression: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.atomic_flush: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.max_background_jobs: 4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.max_background_compactions: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.max_subcompactions: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.max_open_files: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Compression algorithms supported:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kZSTD supported: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kXpressCompression supported: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kBZip2Compression supported: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kLZ4Compression supported: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kZlibCompression supported: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kLZ4HCCompression supported: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kSnappyCompression supported: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc76dd0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc76dd0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc76dd0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc76dd0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc76dd0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc76dd0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc76dd0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806e0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc76430
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806e0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc76430
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806e0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc76430
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2f5dfbc6-dc11-46fc-bc15-f484fccc197b
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188179388, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188179602, "job": 1, "event": "recovery_finished"}
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: freelist init
Nov 29 06:19:48 compute-2 ceph-osd[79822]: freelist _read_cfg
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluefs umount
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) close
Nov 29 06:19:48 compute-2 ceph-mon[77142]: pgmap v112: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:48 compute-2 ceph-mon[77142]: 2.19 scrub starts
Nov 29 06:19:48 compute-2 ceph-mon[77142]: 2.19 scrub ok
Nov 29 06:19:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:48 compute-2 podman[80025]: 2025-11-29 06:19:48.208443652 +0000 UTC m=+0.047231499 container create 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Nov 29 06:19:48 compute-2 systemd[1]: Started libpod-conmon-5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67.scope.
Nov 29 06:19:48 compute-2 podman[80025]: 2025-11-29 06:19:48.188970521 +0000 UTC m=+0.027758458 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:48 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f13f68a42db64cb3a604ffc6f71e41d67c01ba21a9ad610540e0c80e1503601a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f13f68a42db64cb3a604ffc6f71e41d67c01ba21a9ad610540e0c80e1503601a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f13f68a42db64cb3a604ffc6f71e41d67c01ba21a9ad610540e0c80e1503601a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f13f68a42db64cb3a604ffc6f71e41d67c01ba21a9ad610540e0c80e1503601a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:48 compute-2 podman[80025]: 2025-11-29 06:19:48.305858944 +0000 UTC m=+0.144646831 container init 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:48 compute-2 podman[80025]: 2025-11-29 06:19:48.313461403 +0000 UTC m=+0.152249280 container start 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Nov 29 06:19:48 compute-2 podman[80025]: 2025-11-29 06:19:48.316732929 +0000 UTC m=+0.155520876 container attach 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:19:48 compute-2 ceph-mgr[77504]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 29 06:19:48 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'test_orchestrator'
Nov 29 06:19:48 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:48.347+0000 7f62940bd140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluefs mount
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluefs mount shared_bdev_used = 4718592
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: RocksDB version: 7.9.2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Git sha 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: DB SUMMARY
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: DB Session ID:  IRL5VW3ZF53NYTB339J6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: CURRENT file:  CURRENT
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                         Options.error_if_exists: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.create_if_missing: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                                     Options.env: 0x55fecfdc84d0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                                Options.info_log: 0x55fecfc81920
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                              Options.statistics: (nil)
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.use_fsync: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                              Options.db_log_dir: 
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                                 Options.wal_dir: db.wal
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.write_buffer_manager: 0x55fed0ba8460
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.unordered_write: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.row_cache: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                              Options.wal_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.two_write_queues: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.wal_compression: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.atomic_flush: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.max_background_jobs: 4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.max_background_compactions: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.max_subcompactions: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.max_open_files: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Compression algorithms supported:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kZSTD supported: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kXpressCompression supported: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kBZip2Compression supported: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kLZ4Compression supported: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kZlibCompression supported: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kLZ4HCCompression supported: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         kSnappyCompression supported: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc77350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc77350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc77350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc77350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc77350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc77350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc77350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a040)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc774b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a040)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc774b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a040)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fecfc774b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2f5dfbc6-dc11-46fc-bc15-f484fccc197b
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188458299, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188464068, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397188, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2f5dfbc6-dc11-46fc-bc15-f484fccc197b", "db_session_id": "IRL5VW3ZF53NYTB339J6", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188466418, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397188, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2f5dfbc6-dc11-46fc-bc15-f484fccc197b", "db_session_id": "IRL5VW3ZF53NYTB339J6", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188468743, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397188, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2f5dfbc6-dc11-46fc-bc15-f484fccc197b", "db_session_id": "IRL5VW3ZF53NYTB339J6", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188470021, "job": 1, "event": "recovery_finished"}
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55fed0aabc00
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: DB pointer 0x55fed0b91a00
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Nov 29 06:19:48 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:19:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Nov 29 06:19:48 compute-2 ceph-osd[79822]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 29 06:19:48 compute-2 ceph-osd[79822]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 29 06:19:48 compute-2 ceph-osd[79822]: _get_class not permitted to load lua
Nov 29 06:19:48 compute-2 ceph-osd[79822]: _get_class not permitted to load sdk
Nov 29 06:19:48 compute-2 ceph-osd[79822]: _get_class not permitted to load test_remote_reads
Nov 29 06:19:48 compute-2 ceph-osd[79822]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 29 06:19:48 compute-2 ceph-osd[79822]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 29 06:19:48 compute-2 ceph-osd[79822]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 29 06:19:48 compute-2 ceph-osd[79822]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 29 06:19:48 compute-2 ceph-osd[79822]: osd.2 0 load_pgs
Nov 29 06:19:48 compute-2 ceph-osd[79822]: osd.2 0 load_pgs opened 0 pgs
Nov 29 06:19:48 compute-2 ceph-osd[79822]: osd.2 0 log_to_monitors true
Nov 29 06:19:48 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2[79818]: 2025-11-29T06:19:48.496+0000 7fd15ca6c740 -1 osd.2 0 log_to_monitors true
Nov 29 06:19:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0) v1
Nov 29 06:19:48 compute-2 ceph-mon[77142]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 29 06:19:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:49 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:49.076+0000 7f62940bd140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 29 06:19:49 compute-2 ceph-mgr[77504]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 29 06:19:49 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'volumes'
Nov 29 06:19:49 compute-2 clever_almeida[80235]: {
Nov 29 06:19:49 compute-2 clever_almeida[80235]:     "f86a06f9-a09f-46de-8440-929a842d2c66": {
Nov 29 06:19:49 compute-2 clever_almeida[80235]:         "ceph_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 06:19:49 compute-2 clever_almeida[80235]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Nov 29 06:19:49 compute-2 clever_almeida[80235]:         "osd_id": 2,
Nov 29 06:19:49 compute-2 clever_almeida[80235]:         "osd_uuid": "f86a06f9-a09f-46de-8440-929a842d2c66",
Nov 29 06:19:49 compute-2 clever_almeida[80235]:         "type": "bluestore"
Nov 29 06:19:49 compute-2 clever_almeida[80235]:     }
Nov 29 06:19:49 compute-2 clever_almeida[80235]: }
Nov 29 06:19:49 compute-2 systemd[1]: libpod-5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67.scope: Deactivated successfully.
Nov 29 06:19:49 compute-2 systemd[1]: libpod-5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67.scope: Consumed 1.038s CPU time.
Nov 29 06:19:49 compute-2 podman[80025]: 2025-11-29 06:19:49.346747354 +0000 UTC m=+1.185535201 container died 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Nov 29 06:19:49 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 29 06:19:49 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 29 06:19:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e32 e32: 3 total, 2 up, 3 in
Nov 29 06:19:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]} v 0) v1
Nov 29 06:19:49 compute-2 ceph-mon[77142]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 29 06:19:49 compute-2 systemd[1]: var-lib-containers-storage-overlay-f13f68a42db64cb3a604ffc6f71e41d67c01ba21a9ad610540e0c80e1503601a-merged.mount: Deactivated successfully.
Nov 29 06:19:49 compute-2 ceph-mon[77142]: pgmap v113: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:49 compute-2 ceph-mon[77142]: from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 29 06:19:49 compute-2 ceph-mon[77142]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 29 06:19:49 compute-2 ceph-mon[77142]: from='client.14301 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 29 06:19:49 compute-2 podman[80025]: 2025-11-29 06:19:49.645970114 +0000 UTC m=+1.484758001 container remove 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507)
Nov 29 06:19:49 compute-2 systemd[1]: libpod-conmon-5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67.scope: Deactivated successfully.
Nov 29 06:19:49 compute-2 sudo[79912]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:49 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:49.890+0000 7f62940bd140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 29 06:19:49 compute-2 ceph-mgr[77504]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 29 06:19:49 compute-2 ceph-mgr[77504]: mgr[py] Loading python module 'zabbix'
Nov 29 06:19:50 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:50.160+0000 7f62940bd140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 29 06:19:50 compute-2 ceph-mgr[77504]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 29 06:19:50 compute-2 ceph-mgr[77504]: ms_deliver_dispatch: unhandled message 0x563492cf5080 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Nov 29 06:19:50 compute-2 ceph-mgr[77504]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 06:19:50 compute-2 sudo[80485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:50 compute-2 sudo[80485]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:50 compute-2 sudo[80485]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:50 compute-2 sudo[80510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:19:50 compute-2 sudo[80510]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:50 compute-2 sudo[80510]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:50 compute-2 sudo[80535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:50 compute-2 sudo[80535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:50 compute-2 sudo[80535]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:50 compute-2 sudo[80560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:50 compute-2 sudo[80560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:50 compute-2 sudo[80560]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:50 compute-2 sudo[80585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:50 compute-2 sudo[80585]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:50 compute-2 sudo[80585]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:50 compute-2 sudo[80610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:19:50 compute-2 sudo[80610]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:51 compute-2 ceph-mon[77142]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Nov 29 06:19:51 compute-2 ceph-mon[77142]: osdmap e32: 3 total, 2 up, 3 in
Nov 29 06:19:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:51 compute-2 ceph-mon[77142]: from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 29 06:19:51 compute-2 ceph-mon[77142]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 29 06:19:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:51 compute-2 ceph-mon[77142]: pgmap v115: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:51 compute-2 ceph-mon[77142]: Standby manager daemon compute-2.ngsyhe started
Nov 29 06:19:51 compute-2 ceph-mgr[77504]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 06:19:52 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e33 e33: 3 total, 2 up, 3 in
Nov 29 06:19:52 compute-2 ceph-osd[79822]: osd.2 0 done with init, starting boot process
Nov 29 06:19:52 compute-2 ceph-osd[79822]: osd.2 0 start_boot
Nov 29 06:19:52 compute-2 ceph-osd[79822]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 29 06:19:52 compute-2 ceph-osd[79822]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 29 06:19:52 compute-2 ceph-osd[79822]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 29 06:19:52 compute-2 ceph-osd[79822]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 29 06:19:52 compute-2 ceph-osd[79822]: osd.2 0  bench count 12288000 bsize 4 KiB
Nov 29 06:19:52 compute-2 podman[80707]: 2025-11-29 06:19:52.117091945 +0000 UTC m=+0.943771607 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:52 compute-2 ceph-mon[77142]: from='client.14307 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 29 06:19:52 compute-2 podman[80707]: 2025-11-29 06:19:52.465401501 +0000 UTC m=+1.292081083 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:19:54 compute-2 ceph-mon[77142]: purged_snaps scrub starts
Nov 29 06:19:54 compute-2 ceph-mon[77142]: purged_snaps scrub ok
Nov 29 06:19:54 compute-2 ceph-mon[77142]: 2.15 scrub starts
Nov 29 06:19:54 compute-2 ceph-mon[77142]: 2.15 scrub ok
Nov 29 06:19:54 compute-2 ceph-mon[77142]: pgmap v116: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:54 compute-2 ceph-mon[77142]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Nov 29 06:19:54 compute-2 ceph-mon[77142]: osdmap e33: 3 total, 2 up, 3 in
Nov 29 06:19:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:54 compute-2 ceph-mon[77142]: mgrmap e9: compute-0.vxabpq(active, since 2m), standbys: compute-2.ngsyhe
Nov 29 06:19:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mgr metadata", "who": "compute-2.ngsyhe", "id": "compute-2.ngsyhe"}]: dispatch
Nov 29 06:19:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:54 compute-2 ceph-mon[77142]: from='client.14313 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 29 06:19:54 compute-2 ceph-mon[77142]: Standby manager daemon compute-1.gaxpay started
Nov 29 06:19:54 compute-2 sudo[80610]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:54 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:55 compute-2 sshd-session[80795]: Invalid user jira from 92.118.39.92 port 43364
Nov 29 06:19:55 compute-2 sshd-session[80795]: Connection closed by invalid user jira 92.118.39.92 port 43364 [preauth]
Nov 29 06:19:56 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e34 e34: 3 total, 2 up, 3 in
Nov 29 06:19:56 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:56 compute-2 ceph-mon[77142]: pgmap v118: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:56 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:56 compute-2 ceph-mon[77142]: from='client.14319 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 29 06:19:57 compute-2 sudo[80797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:57 compute-2 sudo[80797]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:57 compute-2 sudo[80797]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:57 compute-2 sudo[80822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:57 compute-2 sudo[80822]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:57 compute-2 sudo[80822]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:57 compute-2 sudo[80847]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:57 compute-2 sudo[80847]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:57 compute-2 sudo[80847]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:57 compute-2 sudo[80872]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:19:57 compute-2 sudo[80872]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:57 compute-2 sudo[80872]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:57 compute-2 sudo[80929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:57 compute-2 sudo[80929]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:57 compute-2 sudo[80929]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:57 compute-2 sudo[80954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:57 compute-2 sudo[80954]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:57 compute-2 sudo[80954]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:57 compute-2 sudo[80979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:57 compute-2 sudo[80979]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:58 compute-2 sudo[80979]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:58 compute-2 sudo[81004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- inventory --format=json-pretty --filter-for-batch
Nov 29 06:19:58 compute-2 sudo[81004]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:58 compute-2 ceph-mon[77142]: mgrmap e10: compute-0.vxabpq(active, since 3m), standbys: compute-2.ngsyhe, compute-1.gaxpay
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mgr metadata", "who": "compute-1.gaxpay", "id": "compute-1.gaxpay"}]: dispatch
Nov 29 06:19:58 compute-2 ceph-mon[77142]: pgmap v119: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:58 compute-2 ceph-mon[77142]: osdmap e34: 3 total, 2 up, 3 in
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/4274267034' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:58 compute-2 podman[81069]: 2025-11-29 06:19:58.375402048 +0000 UTC m=+0.024927735 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:59 compute-2 podman[81069]: 2025-11-29 06:19:59.206975763 +0000 UTC m=+0.856501430 container create e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 06:19:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e35 e35: 3 total, 2 up, 3 in
Nov 29 06:19:59 compute-2 systemd[1]: Started libpod-conmon-e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8.scope.
Nov 29 06:19:59 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:19:59 compute-2 podman[81069]: 2025-11-29 06:19:59.343026437 +0000 UTC m=+0.992552134 container init e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:59 compute-2 podman[81069]: 2025-11-29 06:19:59.350186075 +0000 UTC m=+0.999711742 container start e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Nov 29 06:19:59 compute-2 elegant_dubinsky[81085]: 167 167
Nov 29 06:19:59 compute-2 systemd[1]: libpod-e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8.scope: Deactivated successfully.
Nov 29 06:19:59 compute-2 podman[81069]: 2025-11-29 06:19:59.389208017 +0000 UTC m=+1.038733704 container attach e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Nov 29 06:19:59 compute-2 podman[81069]: 2025-11-29 06:19:59.389641169 +0000 UTC m=+1.039166836 container died e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:19:59 compute-2 systemd[1]: var-lib-containers-storage-overlay-3f65f410c47ef2da0e7a20b01738f1e234ab3c379e803a03a343a02133afc89f-merged.mount: Deactivated successfully.
Nov 29 06:19:59 compute-2 podman[81069]: 2025-11-29 06:19:59.661544282 +0000 UTC m=+1.311069949 container remove e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef)
Nov 29 06:19:59 compute-2 systemd[1]: libpod-conmon-e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8.scope: Deactivated successfully.
Nov 29 06:19:59 compute-2 podman[81108]: 2025-11-29 06:19:59.820062335 +0000 UTC m=+0.033421856 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:20:00 compute-2 ceph-mon[77142]: pgmap v121: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:00 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:00 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:01 compute-2 podman[81108]: 2025-11-29 06:20:01.131421251 +0000 UTC m=+1.344780722 container create 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Nov 29 06:20:01 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:02 compute-2 systemd[1]: Started libpod-conmon-19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee.scope.
Nov 29 06:20:02 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:20:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a8cc040208d8bd265fcf207c812a4c015bfceb9831d466d132bf115ad4047e7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a8cc040208d8bd265fcf207c812a4c015bfceb9831d466d132bf115ad4047e7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a8cc040208d8bd265fcf207c812a4c015bfceb9831d466d132bf115ad4047e7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a8cc040208d8bd265fcf207c812a4c015bfceb9831d466d132bf115ad4047e7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e36 e36: 3 total, 2 up, 3 in
Nov 29 06:20:02 compute-2 podman[81108]: 2025-11-29 06:20:02.860531032 +0000 UTC m=+3.073890483 container init 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Nov 29 06:20:02 compute-2 podman[81108]: 2025-11-29 06:20:02.869642851 +0000 UTC m=+3.083002282 container start 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507)
Nov 29 06:20:02 compute-2 podman[81108]: 2025-11-29 06:20:02.899278587 +0000 UTC m=+3.112638018 container attach 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:20:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:20:03 compute-2 ceph-mon[77142]: osdmap e35: 3 total, 2 up, 3 in
Nov 29 06:20:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:20:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2162770432' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Nov 29 06:20:03 compute-2 ceph-mon[77142]: pgmap v123: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:03 compute-2 ceph-mon[77142]: Health detail: HEALTH_ERR 1 filesystem is offline; 1 filesystem is online with fewer MDS than max_mds
Nov 29 06:20:03 compute-2 ceph-mon[77142]: [ERR] MDS_ALL_DOWN: 1 filesystem is offline
Nov 29 06:20:03 compute-2 ceph-mon[77142]:     fs cephfs is offline because no MDS is active for it.
Nov 29 06:20:03 compute-2 ceph-mon[77142]: [WRN] MDS_UP_LESS_THAN_MAX: 1 filesystem is online with fewer MDS than max_mds
Nov 29 06:20:03 compute-2 ceph-mon[77142]:     fs cephfs has 0 MDS online, but wants 1
Nov 29 06:20:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:04 compute-2 stoic_napier[81124]: [
Nov 29 06:20:04 compute-2 stoic_napier[81124]:     {
Nov 29 06:20:04 compute-2 stoic_napier[81124]:         "available": false,
Nov 29 06:20:04 compute-2 stoic_napier[81124]:         "ceph_device": false,
Nov 29 06:20:04 compute-2 stoic_napier[81124]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:         "lsm_data": {},
Nov 29 06:20:04 compute-2 stoic_napier[81124]:         "lvs": [],
Nov 29 06:20:04 compute-2 stoic_napier[81124]:         "path": "/dev/sr0",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:         "rejected_reasons": [
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "Insufficient space (<5GB)",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "Has a FileSystem"
Nov 29 06:20:04 compute-2 stoic_napier[81124]:         ],
Nov 29 06:20:04 compute-2 stoic_napier[81124]:         "sys_api": {
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "actuators": null,
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "device_nodes": "sr0",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "devname": "sr0",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "human_readable_size": "482.00 KB",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "id_bus": "ata",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "model": "QEMU DVD-ROM",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "nr_requests": "2",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "parent": "/dev/sr0",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "partitions": {},
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "path": "/dev/sr0",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "removable": "1",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "rev": "2.5+",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "ro": "0",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "rotational": "1",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "sas_address": "",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "sas_device_handle": "",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "scheduler_mode": "mq-deadline",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "sectors": 0,
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "sectorsize": "2048",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "size": 493568.0,
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "support_discard": "2048",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "type": "disk",
Nov 29 06:20:04 compute-2 stoic_napier[81124]:             "vendor": "QEMU"
Nov 29 06:20:04 compute-2 stoic_napier[81124]:         }
Nov 29 06:20:04 compute-2 stoic_napier[81124]:     }
Nov 29 06:20:04 compute-2 stoic_napier[81124]: ]
Nov 29 06:20:04 compute-2 systemd[1]: libpod-19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee.scope: Deactivated successfully.
Nov 29 06:20:04 compute-2 podman[81108]: 2025-11-29 06:20:04.087331103 +0000 UTC m=+4.300690534 container died 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Nov 29 06:20:04 compute-2 systemd[1]: libpod-19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee.scope: Consumed 1.214s CPU time.
Nov 29 06:20:04 compute-2 systemd[1]: var-lib-containers-storage-overlay-1a8cc040208d8bd265fcf207c812a4c015bfceb9831d466d132bf115ad4047e7-merged.mount: Deactivated successfully.
Nov 29 06:20:04 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e37 e37: 3 total, 2 up, 3 in
Nov 29 06:20:04 compute-2 podman[81108]: 2025-11-29 06:20:04.201958316 +0000 UTC m=+4.415317747 container remove 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2)
Nov 29 06:20:04 compute-2 systemd[1]: libpod-conmon-19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee.scope: Deactivated successfully.
Nov 29 06:20:04 compute-2 sudo[81004]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:05 compute-2 ceph-mon[77142]: osdmap e36: 3 total, 2 up, 3 in
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Nov 29 06:20:05 compute-2 ceph-mon[77142]: pgmap v125: 100 pgs: 62 unknown, 38 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3618548784' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:05 compute-2 ceph-mon[77142]: osdmap e37: 3 total, 2 up, 3 in
Nov 29 06:20:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-2 sudo[82297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:05 compute-2 sudo[82297]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:05 compute-2 sudo[82297]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:05 compute-2 sudo[82322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 29 06:20:05 compute-2 sudo[82322]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:05 compute-2 sudo[82322]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-2 sudo[82347]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e38 e38: 3 total, 2 up, 3 in
Nov 29 06:20:06 compute-2 sudo[82347]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:06 compute-2 sudo[82372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph
Nov 29 06:20:06 compute-2 sudo[82372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82372]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-2 sudo[82397]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82397]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:20:06 compute-2 sudo[82422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82422]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-2 sudo[82447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82447]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:20:06 compute-2 sudo[82472]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82472]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-2 sudo[82497]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82497]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:20:06 compute-2 sudo[82522]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82522]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-2 sudo[82570]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82570]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:20:06 compute-2 sudo[82595]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82595]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-2 sudo[82620]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82620]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:20:06 compute-2 sudo[82645]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82645]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-2 sudo[82670]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82670]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 29 06:20:06 compute-2 sudo[82695]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82695]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-2 sudo[82720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-2 sudo[82720]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-2 sudo[82720]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-2 sudo[82745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:20:07 compute-2 sudo[82745]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-2 sudo[82745]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-2 sudo[82770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:07 compute-2 sudo[82770]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-2 sudo[82770]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-2 sudo[82795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:20:07 compute-2 sudo[82795]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-2 sudo[82795]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-2 sudo[82820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:07 compute-2 sudo[82820]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-2 sudo[82820]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-2 sudo[82845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:20:07 compute-2 sudo[82845]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-2 sudo[82845]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-2 sudo[82870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:07 compute-2 sudo[82870]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-2 sudo[82870]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-2 sudo[82895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:20:07 compute-2 sudo[82895]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-2 sudo[82895]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-2 sudo[82920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:07 compute-2 sudo[82920]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-2 sudo[82920]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-2 sudo[82945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:20:07 compute-2 sudo[82945]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-2 sudo[82945]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-2 sudo[82993]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:07 compute-2 sudo[82993]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-2 sudo[82993]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:08 compute-2 sudo[83018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:20:08 compute-2 sudo[83018]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:08 compute-2 sudo[83018]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:08 compute-2 sudo[83043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:08 compute-2 sudo[83043]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:08 compute-2 sudo[83043]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:08 compute-2 sudo[83068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:20:08 compute-2 sudo[83068]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:08 compute-2 sudo[83068]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:08 compute-2 sudo[83093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:08 compute-2 sudo[83093]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:08 compute-2 sudo[83093]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:08 compute-2 sudo[83118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:20:08 compute-2 sudo[83118]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:08 compute-2 sudo[83118]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:09 compute-2 ceph-osd[79822]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 5.398 iops: 1381.921 elapsed_sec: 2.171
Nov 29 06:20:09 compute-2 ceph-osd[79822]: log_channel(cluster) log [WRN] : OSD bench result of 1381.921175 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 29 06:20:09 compute-2 ceph-osd[79822]: osd.2 0 waiting for initial osdmap
Nov 29 06:20:09 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2[79818]: 2025-11-29T06:20:09.161+0000 7fd159203640 -1 osd.2 0 waiting for initial osdmap
Nov 29 06:20:09 compute-2 ceph-osd[79822]: osd.2 38 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 29 06:20:09 compute-2 ceph-osd[79822]: osd.2 38 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Nov 29 06:20:09 compute-2 ceph-osd[79822]: osd.2 38 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 29 06:20:09 compute-2 ceph-osd[79822]: osd.2 38 check_osdmap_features require_osd_release unknown -> reef
Nov 29 06:20:09 compute-2 ceph-osd[79822]: osd.2 38 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 29 06:20:09 compute-2 ceph-osd[79822]: osd.2 38 set_numa_affinity not setting numa affinity
Nov 29 06:20:09 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2[79818]: 2025-11-29T06:20:09.192+0000 7fd154014640 -1 osd.2 38 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 29 06:20:09 compute-2 ceph-osd[79822]: osd.2 38 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Nov 29 06:20:10 compute-2 ceph-osd[79822]: osd.2 38 tick checking mon for new map
Nov 29 06:20:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:13 compute-2 ceph-mon[77142]: pgmap v126: 100 pgs: 62 unknown, 38 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:20:13 compute-2 ceph-mon[77142]: 4.1 scrub starts
Nov 29 06:20:13 compute-2 ceph-mon[77142]: 4.1 scrub ok
Nov 29 06:20:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:13 compute-2 ceph-mon[77142]: 4.2 scrub starts
Nov 29 06:20:13 compute-2 ceph-mon[77142]: 4.2 scrub ok
Nov 29 06:20:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Nov 29 06:20:13 compute-2 ceph-mon[77142]: Adjusting osd_memory_target on compute-2 to 128.0M
Nov 29 06:20:13 compute-2 ceph-mon[77142]: Unable to set osd_memory_target on compute-2 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Nov 29 06:20:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:20:13 compute-2 ceph-mon[77142]: Updating compute-0:/etc/ceph/ceph.conf
Nov 29 06:20:13 compute-2 ceph-mon[77142]: Updating compute-1:/etc/ceph/ceph.conf
Nov 29 06:20:13 compute-2 ceph-mon[77142]: Updating compute-2:/etc/ceph/ceph.conf
Nov 29 06:20:13 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3247558833' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Nov 29 06:20:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:14 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e39 e39: 3 total, 2 up, 3 in
Nov 29 06:20:16 compute-2 ceph-mon[77142]: pgmap v128: 131 pgs: 62 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:20:16 compute-2 ceph-mon[77142]: osdmap e38: 3 total, 2 up, 3 in
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: 4.3 scrub starts
Nov 29 06:20:16 compute-2 ceph-mon[77142]: 4.3 scrub ok
Nov 29 06:20:16 compute-2 ceph-mon[77142]: Updating compute-0:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:20:16 compute-2 ceph-mon[77142]: Updating compute-1:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:20:16 compute-2 ceph-mon[77142]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: pgmap v130: 131 pgs: 62 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: OSD bench result of 1381.921175 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 29 06:20:16 compute-2 ceph-mon[77142]: pgmap v131: 131 pgs: 62 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: 4.4 scrub starts
Nov 29 06:20:16 compute-2 ceph-mon[77142]: 4.4 scrub ok
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: pgmap v132: 131 pgs: 62 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: 4.5 scrub starts
Nov 29 06:20:16 compute-2 ceph-mon[77142]: 4.5 scrub ok
Nov 29 06:20:16 compute-2 ceph-mon[77142]: 4.6 scrub starts
Nov 29 06:20:16 compute-2 ceph-mon[77142]: 4.6 scrub ok
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 06:20:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:17 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Nov 29 06:20:17 compute-2 ceph-osd[79822]: osd.2 40 state: booting -> active
Nov 29 06:20:17 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.1d( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:17 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.b( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:17 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.5( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:17 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.1c( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:17 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.f( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:17 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.18( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:17 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.12( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.a( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.9( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.2( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.4( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1a( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.18( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1c( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-mon[77142]: pgmap v133: 131 pgs: 62 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:19 compute-2 ceph-mon[77142]: osdmap e39: 3 total, 2 up, 3 in
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:19 compute-2 ceph-mon[77142]: pgmap v135: 146 pgs: 77 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.e( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:19 compute-2 ceph-mon[77142]: osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518] boot
Nov 29 06:20:19 compute-2 ceph-mon[77142]: osdmap e40: 3 total, 3 up, 3 in
Nov 29 06:20:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.1b( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.15( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.13( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.12( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.14( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.17( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.10( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.11( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.3( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.8( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.6( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1e( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.7( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.6( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.5( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.d( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1f( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.c( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.a( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.c( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.d( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.d( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.a( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.b( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.b( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.f( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.8( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.10( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.15( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.13( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.10( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.16( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.e( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.17( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1d( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1b( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.19( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1b( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1c( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1f( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1d( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:27 compute-2 sshd-session[71340]: Received disconnect from 38.102.83.107 port 47870:11: disconnected by user
Nov 29 06:20:27 compute-2 sshd-session[71340]: Disconnected from user zuul 38.102.83.107 port 47870
Nov 29 06:20:27 compute-2 sshd-session[71337]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:20:27 compute-2 systemd[1]: session-19.scope: Deactivated successfully.
Nov 29 06:20:27 compute-2 systemd[1]: session-19.scope: Consumed 9.258s CPU time.
Nov 29 06:20:28 compute-2 systemd-logind[784]: Session 19 logged out. Waiting for processes to exit.
Nov 29 06:20:28 compute-2 systemd-logind[784]: Removed session 19.
Nov 29 06:20:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.10( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.b( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.b( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.c( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.d( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.d( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.a( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.6( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.0( empty local-lis/les=40/41 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.0( empty local-lis/les=40/41 n=0 ec=14/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.6( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.3( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.9( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.13( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.19( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.8( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1b( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.2( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.4( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.a( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.10( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.17( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1a( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.14( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.e( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.12( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.15( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.10( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1b( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.17( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.12( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:29 compute-2 ceph-mon[77142]: pgmap v137: 177 pgs: 108 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:29 compute-2 ceph-mon[77142]: 4.7 scrub starts
Nov 29 06:20:29 compute-2 ceph-mon[77142]: 4.7 scrub ok
Nov 29 06:20:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:30 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Nov 29 06:20:30 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Nov 29 06:20:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:31 compute-2 ceph-mon[77142]: pgmap v138: 177 pgs: 7 peering, 108 unknown, 62 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-2 ceph-mon[77142]: pgmap v139: 177 pgs: 24 peering, 93 unknown, 60 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-2 ceph-mon[77142]: pgmap v140: 177 pgs: 95 peering, 31 unknown, 51 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-2 ceph-mon[77142]: pgmap v141: 177 pgs: 95 peering, 31 unknown, 51 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-2 ceph-mon[77142]: 4.8 deep-scrub starts
Nov 29 06:20:31 compute-2 ceph-mon[77142]: 4.8 deep-scrub ok
Nov 29 06:20:31 compute-2 ceph-mon[77142]: pgmap v142: 177 pgs: 95 peering, 31 unknown, 51 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:31 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:31 compute-2 ceph-mon[77142]: osdmap e41: 3 total, 3 up, 3 in
Nov 29 06:20:31 compute-2 ceph-mon[77142]: pgmap v144: 177 pgs: 95 peering, 31 unknown, 51 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:31 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:20:31 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:20:31 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:32 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Nov 29 06:20:32 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Nov 29 06:20:32 compute-2 ceph-mon[77142]: 4.9 deep-scrub starts
Nov 29 06:20:32 compute-2 ceph-mon[77142]: 4.9 deep-scrub ok
Nov 29 06:20:32 compute-2 ceph-mon[77142]: 5.c deep-scrub starts
Nov 29 06:20:32 compute-2 ceph-mon[77142]: 5.c deep-scrub ok
Nov 29 06:20:34 compute-2 ceph-mon[77142]: pgmap v145: 177 pgs: 78 peering, 99 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:34 compute-2 ceph-mon[77142]: 4.a deep-scrub starts
Nov 29 06:20:34 compute-2 ceph-mon[77142]: 4.a deep-scrub ok
Nov 29 06:20:34 compute-2 ceph-mon[77142]: 5.2 scrub starts
Nov 29 06:20:34 compute-2 ceph-mon[77142]: 5.2 scrub ok
Nov 29 06:20:34 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.a deep-scrub starts
Nov 29 06:20:34 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.a deep-scrub ok
Nov 29 06:20:35 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Nov 29 06:20:35 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Nov 29 06:20:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.14( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.8( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.2( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.19( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.14( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.1d( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.9( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[6.1( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.3( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.6( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.1f( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.1d( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.15( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.16( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.1f( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524729729s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.942543030s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524394035s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.942268372s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524551392s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.942417145s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.17( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.525029182s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.942920685s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524342537s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.942268372s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524610519s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.942543030s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.17( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524973869s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.942920685s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.10( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524315834s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.942390442s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524477005s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.942417145s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.10( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524250984s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.942390442s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.12( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521661758s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939975739s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521576881s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939945221s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521528244s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939945221s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.12( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521575928s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939975739s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521255493s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939750671s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.14( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521340370s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939865112s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521452904s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939983368s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521198273s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939750671s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.14( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521246910s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939865112s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521417618s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939983368s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519620895s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938369751s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519574165s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938369751s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519407272s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938304901s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519376755s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938304901s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519334793s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938304901s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.17( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519217491s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938220978s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.17( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519193649s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938220978s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519282341s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938304901s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.520560265s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939620972s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.a( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518817902s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938079834s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.a( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518780708s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938079834s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.4( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518674850s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938003540s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518661499s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937999725s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518634796s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937999725s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.520272255s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939620972s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.4( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518590927s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938003540s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.2( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518420219s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937969208s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1b( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518275261s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937923431s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.2( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518387794s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937969208s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1b( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518247604s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937923431s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518118858s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937870026s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.13( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517918587s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937751770s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518052101s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937870026s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.13( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517898560s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937751770s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517808914s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937728882s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.19( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517912865s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937812805s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517782211s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937728882s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.19( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517834663s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937812805s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517837524s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937881470s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517531395s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937652588s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517782211s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937881470s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517941475s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938095093s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517477989s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937652588s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517900467s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938095093s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517189026s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937469482s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.3( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517422676s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937702179s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517161369s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937469482s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517057419s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937427521s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517015457s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937427521s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.6( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516923904s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937355042s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516735077s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937217712s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.3( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517365456s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937702179s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516711235s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937217712s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516699791s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937400818s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516376495s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937091827s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.6( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516853333s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937355042s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.6( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516497612s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937255859s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516351700s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937091827s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516664505s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937400818s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516343117s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937183380s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516312599s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937183380s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.6( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516367912s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937255859s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515216827s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.936321259s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515192032s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.936321259s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.a( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515998840s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937152863s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.d( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515906334s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937076569s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.a( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515972137s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937152863s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.d( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515867233s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937076569s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515160561s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.936397552s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514899254s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.936237335s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515081406s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.936397552s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514871597s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.936237335s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514591217s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.936069489s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514548302s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.936069489s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518133163s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939689636s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518104553s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939689636s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.10( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.509164810s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.930801392s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.509134293s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.930805206s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.b( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514425278s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.936149597s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.b( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514390945s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.936149597s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.10( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.509114265s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.930801392s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.509103775s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.930805206s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:36 compute-2 ceph-mon[77142]: pgmap v146: 177 pgs: 177 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:36 compute-2 ceph-mon[77142]: 7.1 scrub starts
Nov 29 06:20:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:20:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:20:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 06:20:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:20:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:20:36 compute-2 ceph-mon[77142]: osdmap e42: 3 total, 3 up, 3 in
Nov 29 06:20:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Nov 29 06:20:37 compute-2 ceph-mon[77142]: 7.1 scrub ok
Nov 29 06:20:37 compute-2 ceph-mon[77142]: 3.a deep-scrub starts
Nov 29 06:20:37 compute-2 ceph-mon[77142]: 3.a deep-scrub ok
Nov 29 06:20:37 compute-2 ceph-mon[77142]: 7.2 scrub starts
Nov 29 06:20:37 compute-2 ceph-mon[77142]: 4.b scrub starts
Nov 29 06:20:37 compute-2 ceph-mon[77142]: 4.b scrub ok
Nov 29 06:20:37 compute-2 ceph-mon[77142]: 5.4 scrub starts
Nov 29 06:20:37 compute-2 ceph-mon[77142]: 5.4 scrub ok
Nov 29 06:20:37 compute-2 ceph-mon[77142]: pgmap v148: 177 pgs: 14 peering, 163 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:37 compute-2 ceph-mon[77142]: 7.7 scrub starts
Nov 29 06:20:37 compute-2 ceph-mon[77142]: 7.7 scrub ok
Nov 29 06:20:37 compute-2 ceph-mon[77142]: osdmap e43: 3 total, 3 up, 3 in
Nov 29 06:20:38 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.1c( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.1f( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.15( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.16( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.11( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.1f( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.5( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.1d( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[6.1( empty local-lis/les=42/44 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.3( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.6( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.1d( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.9( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.a( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.14( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.1( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.19( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.2( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.14( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.8( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:38 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Nov 29 06:20:38 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Nov 29 06:20:38 compute-2 ceph-mon[77142]: 4.f scrub starts
Nov 29 06:20:38 compute-2 ceph-mon[77142]: 4.f scrub ok
Nov 29 06:20:38 compute-2 ceph-mon[77142]: osdmap e44: 3 total, 3 up, 3 in
Nov 29 06:20:41 compute-2 ceph-mon[77142]: pgmap v150: 177 pgs: 55 peering, 122 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:41 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:41 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:41 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.pkypgd", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 06:20:41 compute-2 ceph-mon[77142]: 7.c scrub starts
Nov 29 06:20:41 compute-2 ceph-mon[77142]: 7.c scrub ok
Nov 29 06:20:41 compute-2 sudo[83145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:41 compute-2 sudo[83145]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:41 compute-2 sudo[83145]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:41 compute-2 sudo[83170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:20:41 compute-2 sudo[83170]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:41 compute-2 sudo[83170]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:41 compute-2 sudo[83195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:41 compute-2 sudo[83195]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:41 compute-2 sudo[83195]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:41 compute-2 sudo[83220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:20:41 compute-2 sudo[83220]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:42 compute-2 podman[83286]: 2025-11-29 06:20:41.910539765 +0000 UTC m=+0.021892852 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:20:42 compute-2 podman[83286]: 2025-11-29 06:20:42.196496928 +0000 UTC m=+0.307849995 container create 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Nov 29 06:20:42 compute-2 ceph-mon[77142]: 3.9 scrub starts
Nov 29 06:20:42 compute-2 ceph-mon[77142]: 3.9 scrub ok
Nov 29 06:20:42 compute-2 ceph-mon[77142]: pgmap v152: 177 pgs: 75 peering, 102 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:42 compute-2 ceph-mon[77142]: 4.10 scrub starts
Nov 29 06:20:42 compute-2 ceph-mon[77142]: 4.10 scrub ok
Nov 29 06:20:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.pkypgd", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 06:20:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:42 compute-2 ceph-mon[77142]: Deploying daemon rgw.rgw.compute-2.pkypgd on compute-2
Nov 29 06:20:42 compute-2 systemd[1]: Started libpod-conmon-2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f.scope.
Nov 29 06:20:42 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:20:42 compute-2 podman[83286]: 2025-11-29 06:20:42.441639148 +0000 UTC m=+0.552992225 container init 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:20:42 compute-2 podman[83286]: 2025-11-29 06:20:42.454463222 +0000 UTC m=+0.565816259 container start 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:20:42 compute-2 podman[83286]: 2025-11-29 06:20:42.459520404 +0000 UTC m=+0.570873491 container attach 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Nov 29 06:20:42 compute-2 zen_thompson[83303]: 167 167
Nov 29 06:20:42 compute-2 systemd[1]: libpod-2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f.scope: Deactivated successfully.
Nov 29 06:20:42 compute-2 podman[83286]: 2025-11-29 06:20:42.463231111 +0000 UTC m=+0.574584148 container died 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:20:42 compute-2 systemd[1]: var-lib-containers-storage-overlay-54fb1e567630240ce82ee3154f21ae2e072a9f814c027988ca3eabb3df1b25f0-merged.mount: Deactivated successfully.
Nov 29 06:20:42 compute-2 podman[83286]: 2025-11-29 06:20:42.514252841 +0000 UTC m=+0.625605868 container remove 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:20:42 compute-2 systemd[1]: libpod-conmon-2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f.scope: Deactivated successfully.
Nov 29 06:20:42 compute-2 systemd[1]: Reloading.
Nov 29 06:20:42 compute-2 systemd-sysv-generator[83350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:20:42 compute-2 systemd-rc-local-generator[83344]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:20:42 compute-2 systemd[1]: Reloading.
Nov 29 06:20:42 compute-2 systemd-sysv-generator[83392]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:20:42 compute-2 systemd-rc-local-generator[83389]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:20:43 compute-2 systemd[1]: Starting Ceph rgw.rgw.compute-2.pkypgd for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:20:43 compute-2 podman[83447]: 2025-11-29 06:20:43.331066571 +0000 UTC m=+0.026905702 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:20:43 compute-2 podman[83447]: 2025-11-29 06:20:43.625197437 +0000 UTC m=+0.321036508 container create b699b579e3e50ec8185ae9d6b5dbb2893bbc7d33eee2cb31319a7c5a262da294 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-2-pkypgd, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Nov 29 06:20:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0cddee5474a74f27c8bbeb36cf940d6f0cabd276d3eda2b505605f0a3eff515/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0cddee5474a74f27c8bbeb36cf940d6f0cabd276d3eda2b505605f0a3eff515/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0cddee5474a74f27c8bbeb36cf940d6f0cabd276d3eda2b505605f0a3eff515/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0cddee5474a74f27c8bbeb36cf940d6f0cabd276d3eda2b505605f0a3eff515/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.pkypgd supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:43 compute-2 podman[83447]: 2025-11-29 06:20:43.707028701 +0000 UTC m=+0.402867742 container init b699b579e3e50ec8185ae9d6b5dbb2893bbc7d33eee2cb31319a7c5a262da294 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-2-pkypgd, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Nov 29 06:20:43 compute-2 podman[83447]: 2025-11-29 06:20:43.716332693 +0000 UTC m=+0.412171744 container start b699b579e3e50ec8185ae9d6b5dbb2893bbc7d33eee2cb31319a7c5a262da294 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-2-pkypgd, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:20:43 compute-2 bash[83447]: b699b579e3e50ec8185ae9d6b5dbb2893bbc7d33eee2cb31319a7c5a262da294
Nov 29 06:20:43 compute-2 systemd[1]: Started Ceph rgw.rgw.compute-2.pkypgd for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:20:43 compute-2 sudo[83220]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:43 compute-2 ceph-mon[77142]: pgmap v153: 177 pgs: 20 peering, 157 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:43 compute-2 ceph-mon[77142]: 4.11 deep-scrub starts
Nov 29 06:20:43 compute-2 ceph-mon[77142]: 4.11 deep-scrub ok
Nov 29 06:20:43 compute-2 radosgw[83467]: deferred set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:20:43 compute-2 radosgw[83467]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Nov 29 06:20:43 compute-2 radosgw[83467]: framework: beast
Nov 29 06:20:43 compute-2 radosgw[83467]: framework conf key: endpoint, val: 192.168.122.102:8082
Nov 29 06:20:43 compute-2 radosgw[83467]: init_numa not setting numa affinity
Nov 29 06:20:44 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Nov 29 06:20:44 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Nov 29 06:20:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Nov 29 06:20:46 compute-2 ceph-mon[77142]: pgmap v154: 177 pgs: 177 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0) v1
Nov 29 06:20:46 compute-2 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 29 06:20:47 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Nov 29 06:20:47 compute-2 ceph-mon[77142]: 3.1a scrub starts
Nov 29 06:20:47 compute-2 ceph-mon[77142]: 3.1a scrub ok
Nov 29 06:20:47 compute-2 ceph-mon[77142]: 7.d scrub starts
Nov 29 06:20:47 compute-2 ceph-mon[77142]: 7.d scrub ok
Nov 29 06:20:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:47 compute-2 ceph-mon[77142]: pgmap v155: 177 pgs: 177 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:47 compute-2 ceph-mon[77142]: osdmap e45: 3 total, 3 up, 3 in
Nov 29 06:20:47 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 29 06:20:47 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 29 06:20:48 compute-2 ceph-mon[77142]: 7.12 scrub starts
Nov 29 06:20:48 compute-2 ceph-mon[77142]: 7.12 scrub ok
Nov 29 06:20:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:48 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Nov 29 06:20:48 compute-2 ceph-mon[77142]: osdmap e46: 3 total, 3 up, 3 in
Nov 29 06:20:48 compute-2 ceph-mon[77142]: pgmap v158: 178 pgs: 1 unknown, 177 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.cbugbv", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 06:20:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.cbugbv", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 06:20:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:48 compute-2 ceph-mon[77142]: Deploying daemon rgw.rgw.compute-1.cbugbv on compute-1
Nov 29 06:20:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Nov 29 06:20:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0) v1
Nov 29 06:20:49 compute-2 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 29 06:20:50 compute-2 ceph-mon[77142]: osdmap e47: 3 total, 3 up, 3 in
Nov 29 06:20:50 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 29 06:20:50 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 29 06:20:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Nov 29 06:20:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:51 compute-2 ceph-mon[77142]: pgmap v160: 179 pgs: 2 unknown, 177 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:51 compute-2 ceph-mon[77142]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:20:51 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Nov 29 06:20:51 compute-2 ceph-mon[77142]: osdmap e48: 3 total, 3 up, 3 in
Nov 29 06:20:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:51 compute-2 ceph-mon[77142]: 7.15 scrub starts
Nov 29 06:20:51 compute-2 ceph-mon[77142]: 7.15 scrub ok
Nov 29 06:20:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vmptkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 06:20:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vmptkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 06:20:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:52 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Nov 29 06:20:52 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Nov 29 06:20:52 compute-2 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 06:20:52 compute-2 ceph-mon[77142]: Deploying daemon rgw.rgw.compute-0.vmptkp on compute-0
Nov 29 06:20:52 compute-2 ceph-mon[77142]: 4.12 scrub starts
Nov 29 06:20:52 compute-2 ceph-mon[77142]: 4.12 scrub ok
Nov 29 06:20:52 compute-2 ceph-mon[77142]: pgmap v162: 179 pgs: 1 creating+peering, 178 active+clean; 450 KiB data, 80 MiB used, 21 GiB / 21 GiB avail; 705 B/s rd, 705 B/s wr, 1 op/s
Nov 29 06:20:52 compute-2 ceph-mon[77142]: osdmap e49: 3 total, 3 up, 3 in
Nov 29 06:20:52 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 06:20:52 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 06:20:52 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1253186838' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 06:20:52 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 06:20:52 compute-2 ceph-mon[77142]: 7.17 deep-scrub starts
Nov 29 06:20:52 compute-2 ceph-mon[77142]: 7.17 deep-scrub ok
Nov 29 06:20:53 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Nov 29 06:20:54 compute-2 ceph-mon[77142]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Nov 29 06:20:54 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 29 06:20:54 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 29 06:20:54 compute-2 ceph-mon[77142]: osdmap e50: 3 total, 3 up, 3 in
Nov 29 06:20:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Nov 29 06:20:55 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.e deep-scrub starts
Nov 29 06:20:56 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Nov 29 06:20:56 compute-2 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:56 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.e deep-scrub ok
Nov 29 06:20:56 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:57 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Nov 29 06:20:57 compute-2 ceph-mon[77142]: 4.16 scrub starts
Nov 29 06:20:57 compute-2 ceph-mon[77142]: 4.16 scrub ok
Nov 29 06:20:57 compute-2 ceph-mon[77142]: pgmap v165: 180 pgs: 1 unknown, 1 creating+peering, 178 active+clean; 450 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 841 B/s rd, 841 B/s wr, 1 op/s
Nov 29 06:20:57 compute-2 ceph-mon[77142]: 4.17 scrub starts
Nov 29 06:20:57 compute-2 ceph-mon[77142]: 4.17 scrub ok
Nov 29 06:20:57 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:57 compute-2 ceph-mon[77142]: osdmap e51: 3 total, 3 up, 3 in
Nov 29 06:20:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:57 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:57 compute-2 sudo[83538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:57 compute-2 sudo[83538]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:57 compute-2 sudo[83538]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:57 compute-2 sudo[83563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:20:57 compute-2 sudo[83563]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:57 compute-2 sudo[83563]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:57 compute-2 sudo[83588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:57 compute-2 sudo[83588]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:57 compute-2 sudo[83588]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:57 compute-2 sudo[83613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:20:57 compute-2 sudo[83613]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:57 compute-2 podman[83679]: 2025-11-29 06:20:57.969004908 +0000 UTC m=+0.057640653 container create a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:20:58 compute-2 podman[83679]: 2025-11-29 06:20:57.944300614 +0000 UTC m=+0.032936439 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:20:58 compute-2 systemd[1]: Started libpod-conmon-a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565.scope.
Nov 29 06:20:58 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:20:58 compute-2 podman[83679]: 2025-11-29 06:20:58.133434274 +0000 UTC m=+0.222070069 container init a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:20:58 compute-2 podman[83679]: 2025-11-29 06:20:58.139939023 +0000 UTC m=+0.228574768 container start a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True)
Nov 29 06:20:58 compute-2 podman[83679]: 2025-11-29 06:20:58.14366473 +0000 UTC m=+0.232300515 container attach a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:20:58 compute-2 flamboyant_germain[83696]: 167 167
Nov 29 06:20:58 compute-2 systemd[1]: libpod-a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565.scope: Deactivated successfully.
Nov 29 06:20:58 compute-2 podman[83679]: 2025-11-29 06:20:58.145114318 +0000 UTC m=+0.233750073 container died a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:20:58 compute-2 systemd[1]: var-lib-containers-storage-overlay-9611f9f647e39f6db2f9a13c55c0509de63dff63545797d591e76033d4ff9795-merged.mount: Deactivated successfully.
Nov 29 06:20:58 compute-2 podman[83679]: 2025-11-29 06:20:58.187893313 +0000 UTC m=+0.276529098 container remove a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Nov 29 06:20:58 compute-2 systemd[1]: libpod-conmon-a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565.scope: Deactivated successfully.
Nov 29 06:20:58 compute-2 systemd[1]: Reloading.
Nov 29 06:20:58 compute-2 systemd-rc-local-generator[83737]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:20:58 compute-2 systemd-sysv-generator[83743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:20:58 compute-2 ceph-mon[77142]: 5.e deep-scrub starts
Nov 29 06:20:58 compute-2 ceph-mon[77142]: pgmap v167: 181 pgs: 1 creating+peering, 1 unknown, 179 active+clean; 450 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 3.3 KiB/s rd, 402 B/s wr, 4 op/s
Nov 29 06:20:58 compute-2 ceph-mon[77142]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:20:58 compute-2 ceph-mon[77142]: 7.19 scrub starts
Nov 29 06:20:58 compute-2 ceph-mon[77142]: 7.19 scrub ok
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:58 compute-2 ceph-mon[77142]: 4.1e scrub starts
Nov 29 06:20:58 compute-2 ceph-mon[77142]: 4.1e scrub ok
Nov 29 06:20:58 compute-2 ceph-mon[77142]: 5.e deep-scrub ok
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 06:20:58 compute-2 ceph-mon[77142]: osdmap e52: 3 total, 3 up, 3 in
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:58 compute-2 ceph-mon[77142]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.gxdwyy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.gxdwyy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:58 compute-2 ceph-mon[77142]: Deploying daemon mds.cephfs.compute-2.gxdwyy on compute-2
Nov 29 06:20:58 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:58 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Nov 29 06:20:58 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Nov 29 06:20:58 compute-2 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:20:58 compute-2 systemd[1]: Reloading.
Nov 29 06:20:58 compute-2 systemd-sysv-generator[83786]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:20:58 compute-2 systemd-rc-local-generator[83782]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:20:58 compute-2 systemd[1]: Starting Ceph mds.cephfs.compute-2.gxdwyy for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:20:59 compute-2 podman[83841]: 2025-11-29 06:20:59.018956475 +0000 UTC m=+0.047474218 container create 4b521558281d5bbdb1d02047f26ae0e54524f34d720616e22f8053cc3b28d2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-2-gxdwyy, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:20:59 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623a962ffe328ef92a00b93b176f643a6f75281f8d58dd21e84de036e3e2c508/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:59 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623a962ffe328ef92a00b93b176f643a6f75281f8d58dd21e84de036e3e2c508/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:59 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623a962ffe328ef92a00b93b176f643a6f75281f8d58dd21e84de036e3e2c508/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:59 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623a962ffe328ef92a00b93b176f643a6f75281f8d58dd21e84de036e3e2c508/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.gxdwyy supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:59 compute-2 podman[83841]: 2025-11-29 06:20:59.087589604 +0000 UTC m=+0.116107367 container init 4b521558281d5bbdb1d02047f26ae0e54524f34d720616e22f8053cc3b28d2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-2-gxdwyy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:20:59 compute-2 podman[83841]: 2025-11-29 06:20:59.092870652 +0000 UTC m=+0.121388385 container start 4b521558281d5bbdb1d02047f26ae0e54524f34d720616e22f8053cc3b28d2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-2-gxdwyy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:20:59 compute-2 podman[83841]: 2025-11-29 06:20:58.99995824 +0000 UTC m=+0.028475993 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:20:59 compute-2 bash[83841]: 4b521558281d5bbdb1d02047f26ae0e54524f34d720616e22f8053cc3b28d2f9
Nov 29 06:20:59 compute-2 systemd[1]: Started Ceph mds.cephfs.compute-2.gxdwyy for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:20:59 compute-2 ceph-mds[83861]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:20:59 compute-2 ceph-mds[83861]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Nov 29 06:20:59 compute-2 ceph-mds[83861]: main not setting numa affinity
Nov 29 06:20:59 compute-2 ceph-mds[83861]: pidfile_write: ignore empty --pid-file
Nov 29 06:20:59 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-2-gxdwyy[83857]: starting mds.cephfs.compute-2.gxdwyy at 
Nov 29 06:20:59 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Updating MDS map to version 2 from mon.1
Nov 29 06:20:59 compute-2 sudo[83613]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e3 new map
Nov 29 06:21:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e3 print_map
                                           e3
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:19:35.589013+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-2.gxdwyy{-1:24145} state up:standby seq 1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Updating MDS map to version 3 from mon.1
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Monitors have assigned me to become a standby.
Nov 29 06:21:00 compute-2 ceph-mon[77142]: pgmap v169: 181 pgs: 1 creating+peering, 180 active+clean; 450 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 2.9 KiB/s rd, 346 B/s wr, 4 op/s
Nov 29 06:21:00 compute-2 ceph-mon[77142]: 7.1a deep-scrub starts
Nov 29 06:21:00 compute-2 ceph-mon[77142]: 7.1a deep-scrub ok
Nov 29 06:21:00 compute-2 ceph-mon[77142]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Nov 29 06:21:00 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 06:21:00 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 06:21:00 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 06:21:00 compute-2 ceph-mon[77142]: osdmap e53: 3 total, 3 up, 3 in
Nov 29 06:21:00 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:21:00 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:21:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e4 new map
Nov 29 06:21:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e4 print_map
                                           e4
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:00.645745+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:creating seq 1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Updating MDS map to version 4 from mon.1
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.4 handle_mds_map i am now mds.0.4
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x1
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x100
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x600
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x601
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x602
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x603
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x604
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x605
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x606
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x607
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x608
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x609
Nov 29 06:21:00 compute-2 ceph-mds[83861]: mds.0.4 creating_done
Nov 29 06:21:00 compute-2 radosgw[83467]: LDAP not started since no server URIs were provided in the configuration.
Nov 29 06:21:00 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-2-pkypgd[83463]: 2025-11-29T06:21:00.876+0000 7fdb615a0940 -1 LDAP not started since no server URIs were provided in the configuration.
Nov 29 06:21:00 compute-2 radosgw[83467]: framework: beast
Nov 29 06:21:00 compute-2 radosgw[83467]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Nov 29 06:21:00 compute-2 radosgw[83467]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Nov 29 06:21:00 compute-2 radosgw[83467]: starting handler: beast
Nov 29 06:21:00 compute-2 radosgw[83467]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:21:01 compute-2 radosgw[83467]: mgrc service_daemon_register rgw.24133 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.pkypgd,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864320,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=916ce3c8-b215-47fd-909b-03c5b552b52f,zone_name=default,zonegroup_id=a7fe8251-a74c-4f06-a680-d530d14bb192,zonegroup_name=default}
Nov 29 06:21:01 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:02 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:02 compute-2 ceph-mon[77142]: pgmap v171: 181 pgs: 1 creating+peering, 180 active+clean; 450 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 2.8 KiB/s rd, 341 B/s wr, 3 op/s
Nov 29 06:21:02 compute-2 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 06:21:02 compute-2 ceph-mon[77142]: osdmap e54: 3 total, 3 up, 3 in
Nov 29 06:21:02 compute-2 ceph-mon[77142]: mds.? [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] up:boot
Nov 29 06:21:02 compute-2 ceph-mon[77142]: daemon mds.cephfs.compute-2.gxdwyy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Nov 29 06:21:02 compute-2 ceph-mon[77142]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Nov 29 06:21:02 compute-2 ceph-mon[77142]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Nov 29 06:21:02 compute-2 ceph-mon[77142]: Cluster is now healthy
Nov 29 06:21:02 compute-2 ceph-mon[77142]: fsmap cephfs:0 1 up:standby
Nov 29 06:21:02 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.gxdwyy"}]: dispatch
Nov 29 06:21:02 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:02 compute-2 ceph-mon[77142]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:creating}
Nov 29 06:21:02 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:02 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jzycnf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 06:21:02 compute-2 ceph-mon[77142]: daemon mds.cephfs.compute-2.gxdwyy is now active in filesystem cephfs as rank 0
Nov 29 06:21:02 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jzycnf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 06:21:02 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:21:02 compute-2 ceph-mon[77142]: Deploying daemon mds.cephfs.compute-0.jzycnf on compute-0
Nov 29 06:21:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e5 new map
Nov 29 06:21:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e5 print_map
                                           e5
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:01.949294+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
Nov 29 06:21:03 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Updating MDS map to version 5 from mon.1
Nov 29 06:21:03 compute-2 ceph-mds[83861]: mds.0.4 handle_mds_map i am now mds.0.4
Nov 29 06:21:03 compute-2 ceph-mds[83861]: mds.0.4 handle_mds_map state change up:creating --> up:active
Nov 29 06:21:03 compute-2 ceph-mds[83861]: mds.0.4 recovery_done -- successful recovery!
Nov 29 06:21:03 compute-2 ceph-mds[83861]: mds.0.4 active_start
Nov 29 06:21:04 compute-2 ceph-mon[77142]: pgmap v173: 181 pgs: 181 active+clean; 452 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 1.2 KiB/s rd, 3.8 KiB/s wr, 13 op/s
Nov 29 06:21:04 compute-2 ceph-mon[77142]: 7.1c scrub starts
Nov 29 06:21:04 compute-2 ceph-mon[77142]: 7.1c scrub ok
Nov 29 06:21:04 compute-2 ceph-mon[77142]: 4.18 scrub starts
Nov 29 06:21:04 compute-2 ceph-mon[77142]: 4.18 scrub ok
Nov 29 06:21:05 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts
Nov 29 06:21:05 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok
Nov 29 06:21:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e6 new map
Nov 29 06:21:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e6 print_map
                                           e6
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:01.949294+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:05 compute-2 ceph-mon[77142]: mds.? [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] up:active
Nov 29 06:21:05 compute-2 ceph-mon[77142]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active}
Nov 29 06:21:05 compute-2 ceph-mon[77142]: 6.4 scrub starts
Nov 29 06:21:05 compute-2 ceph-mon[77142]: 6.4 scrub ok
Nov 29 06:21:05 compute-2 ceph-mon[77142]: pgmap v174: 181 pgs: 181 active+clean; 452 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 1.0 KiB/s rd, 3.3 KiB/s wr, 11 op/s
Nov 29 06:21:05 compute-2 ceph-mon[77142]: 4.13 scrub starts
Nov 29 06:21:05 compute-2 ceph-mon[77142]: 4.13 scrub ok
Nov 29 06:21:05 compute-2 ceph-mon[77142]: 6.6 scrub starts
Nov 29 06:21:05 compute-2 ceph-mon[77142]: 6.6 scrub ok
Nov 29 06:21:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e7 new map
Nov 29 06:21:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e7 print_map
                                           e7
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:01.949294+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:07 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Nov 29 06:21:07 compute-2 ceph-mon[77142]: 5.12 deep-scrub starts
Nov 29 06:21:07 compute-2 ceph-mon[77142]: 5.12 deep-scrub ok
Nov 29 06:21:07 compute-2 ceph-mon[77142]: pgmap v175: 181 pgs: 181 active+clean; 456 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 122 KiB/s rd, 5.6 KiB/s wr, 219 op/s
Nov 29 06:21:07 compute-2 ceph-mon[77142]: 4.c scrub starts
Nov 29 06:21:07 compute-2 ceph-mon[77142]: 4.c scrub ok
Nov 29 06:21:07 compute-2 ceph-mon[77142]: mds.? [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] up:boot
Nov 29 06:21:07 compute-2 ceph-mon[77142]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 1 up:standby
Nov 29 06:21:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.jzycnf"}]: dispatch
Nov 29 06:21:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:21:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:07 compute-2 ceph-mon[77142]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 1 up:standby
Nov 29 06:21:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.vlqnad", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 06:21:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.vlqnad", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 06:21:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:21:07 compute-2 ceph-mon[77142]: Deploying daemon mds.cephfs.compute-1.vlqnad on compute-1
Nov 29 06:21:07 compute-2 ceph-mon[77142]: 6.9 deep-scrub starts
Nov 29 06:21:07 compute-2 ceph-mon[77142]: 6.9 deep-scrub ok
Nov 29 06:21:08 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Nov 29 06:21:08 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:21:08 compute-2 ceph-mon[77142]: osdmap e55: 3 total, 3 up, 3 in
Nov 29 06:21:08 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:21:08 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:08 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:08 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:21:08 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:08 compute-2 ceph-mon[77142]: osdmap e56: 3 total, 3 up, 3 in
Nov 29 06:21:08 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:21:08 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Nov 29 06:21:08 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Nov 29 06:21:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Nov 29 06:21:11 compute-2 ceph-mon[77142]: pgmap v177: 181 pgs: 181 active+clean; 456 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 122 KiB/s rd, 5.6 KiB/s wr, 219 op/s
Nov 29 06:21:11 compute-2 ceph-mon[77142]: 6.e scrub starts
Nov 29 06:21:11 compute-2 ceph-mon[77142]: 6.e scrub ok
Nov 29 06:21:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:12 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Nov 29 06:21:12 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Nov 29 06:21:12 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e8 new map
Nov 29 06:21:12 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e8 print_map
                                           e8
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:01.949294+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
                                           [mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:12 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Nov 29 06:21:12 compute-2 ceph-mon[77142]: 2.13 scrub starts
Nov 29 06:21:12 compute-2 ceph-mon[77142]: 2.13 scrub ok
Nov 29 06:21:12 compute-2 ceph-mon[77142]: 6.b scrub starts
Nov 29 06:21:12 compute-2 ceph-mon[77142]: 6.b scrub ok
Nov 29 06:21:12 compute-2 ceph-mon[77142]: pgmap v179: 212 pgs: 31 unknown, 181 active+clean; 456 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 121 KiB/s rd, 2.7 KiB/s wr, 209 op/s
Nov 29 06:21:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:21:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:12 compute-2 ceph-mon[77142]: osdmap e57: 3 total, 3 up, 3 in
Nov 29 06:21:12 compute-2 ceph-mon[77142]: 6.c scrub starts
Nov 29 06:21:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:21:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:14 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Nov 29 06:21:14 compute-2 ceph-mon[77142]: pgmap v181: 212 pgs: 1 peering, 31 unknown, 180 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 198 op/s
Nov 29 06:21:14 compute-2 ceph-mon[77142]: 6.c scrub ok
Nov 29 06:21:14 compute-2 ceph-mon[77142]: 6.f scrub starts
Nov 29 06:21:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:21:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:14 compute-2 ceph-mon[77142]: 6.f scrub ok
Nov 29 06:21:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:14 compute-2 ceph-mon[77142]: mds.? [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] up:boot
Nov 29 06:21:14 compute-2 ceph-mon[77142]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:21:14 compute-2 ceph-mon[77142]: osdmap e58: 3 total, 3 up, 3 in
Nov 29 06:21:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.vlqnad"}]: dispatch
Nov 29 06:21:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:15 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Nov 29 06:21:15 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Nov 29 06:21:16 compute-2 ceph-mon[77142]: 3.11 scrub starts
Nov 29 06:21:16 compute-2 ceph-mon[77142]: 3.11 scrub ok
Nov 29 06:21:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:16 compute-2 ceph-mon[77142]: pgmap v183: 274 pgs: 1 peering, 93 unknown, 180 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 198 op/s
Nov 29 06:21:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:16 compute-2 ceph-mon[77142]: osdmap e59: 3 total, 3 up, 3 in
Nov 29 06:21:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:16 compute-2 ceph-mon[77142]: Deploying daemon haproxy.rgw.default.compute-0.zzbnoj on compute-0
Nov 29 06:21:16 compute-2 ceph-mon[77142]: 3.8 scrub starts
Nov 29 06:21:16 compute-2 ceph-mon[77142]: 3.8 scrub ok
Nov 29 06:21:16 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Nov 29 06:21:16 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Nov 29 06:21:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Nov 29 06:21:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:17 compute-2 ceph-mon[77142]: 7.6 scrub starts
Nov 29 06:21:17 compute-2 ceph-mon[77142]: 7.6 scrub ok
Nov 29 06:21:17 compute-2 ceph-mon[77142]: pgmap v185: 274 pgs: 31 unknown, 243 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 196 op/s
Nov 29 06:21:17 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:17 compute-2 ceph-mon[77142]: 3.0 scrub starts
Nov 29 06:21:17 compute-2 ceph-mon[77142]: 3.0 scrub ok
Nov 29 06:21:17 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:17 compute-2 ceph-mon[77142]: osdmap e60: 3 total, 3 up, 3 in
Nov 29 06:21:17 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e9 new map
Nov 29 06:21:17 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e9 print_map
                                           e9
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        9
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:17.214295+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 6 join_fscid=1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
                                           [mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:17 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Updating MDS map to version 9 from mon.1
Nov 29 06:21:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Nov 29 06:21:18 compute-2 ceph-mon[77142]: 6.8 scrub starts
Nov 29 06:21:18 compute-2 ceph-mon[77142]: 6.8 scrub ok
Nov 29 06:21:18 compute-2 ceph-mon[77142]: mds.? [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] up:standby
Nov 29 06:21:18 compute-2 ceph-mon[77142]: mds.? [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] up:active
Nov 29 06:21:18 compute-2 ceph-mon[77142]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:21:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:18 compute-2 ceph-mon[77142]: osdmap e61: 3 total, 3 up, 3 in
Nov 29 06:21:20 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.0 deep-scrub starts
Nov 29 06:21:20 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.0 deep-scrub ok
Nov 29 06:21:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e10 new map
Nov 29 06:21:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).mds e10 print_map
                                           e10
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        9
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:17.214295+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 6 join_fscid=1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
                                           [mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 3 join_fscid=1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:21 compute-2 ceph-mon[77142]: pgmap v187: 305 pgs: 31 unknown, 274 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:21 compute-2 ceph-mon[77142]: 7.13 scrub starts
Nov 29 06:21:21 compute-2 ceph-mon[77142]: 7.13 scrub ok
Nov 29 06:21:21 compute-2 ceph-mon[77142]: 7.3 scrub starts
Nov 29 06:21:21 compute-2 ceph-mon[77142]: 7.3 scrub ok
Nov 29 06:21:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:23 compute-2 ceph-mon[77142]: pgmap v189: 305 pgs: 31 unknown, 274 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:23 compute-2 ceph-mon[77142]: 6.d scrub starts
Nov 29 06:21:23 compute-2 ceph-mon[77142]: 6.d scrub ok
Nov 29 06:21:23 compute-2 ceph-mon[77142]: 5.0 deep-scrub starts
Nov 29 06:21:23 compute-2 ceph-mon[77142]: 5.0 deep-scrub ok
Nov 29 06:21:23 compute-2 ceph-mon[77142]: 7.18 scrub starts
Nov 29 06:21:23 compute-2 ceph-mon[77142]: 7.18 scrub ok
Nov 29 06:21:23 compute-2 ceph-mon[77142]: mds.? [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] up:standby
Nov 29 06:21:23 compute-2 ceph-mon[77142]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:21:23 compute-2 ceph-mon[77142]: pgmap v190: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Nov 29 06:21:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:23 compute-2 ceph-mon[77142]: 6.a scrub starts
Nov 29 06:21:23 compute-2 ceph-mon[77142]: 6.a scrub ok
Nov 29 06:21:24 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.d deep-scrub starts
Nov 29 06:21:24 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.3( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.4( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.6( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.e( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.8( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.16( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.15( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.2( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.10( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.5( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.16( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.9( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.a( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.3( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.19( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.a( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.11( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.11( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.1( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.3( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.13( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.1c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.17( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:25 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.d deep-scrub ok
Nov 29 06:21:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Nov 29 06:21:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Nov 29 06:21:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:25 compute-2 ceph-mon[77142]: osdmap e62: 3 total, 3 up, 3 in
Nov 29 06:21:25 compute-2 sshd-session[84434]: Accepted publickey for zuul from 192.168.122.30 port 41814 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:21:25 compute-2 systemd-logind[784]: New session 33 of user zuul.
Nov 29 06:21:25 compute-2 systemd[1]: Started Session 33 of User zuul.
Nov 29 06:21:25 compute-2 sshd-session[84434]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:21:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Nov 29 06:21:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.002000052s ======
Nov 29 06:21:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:26.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Nov 29 06:21:26 compute-2 sudo[84491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:21:26 compute-2 sudo[84491]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:26 compute-2 sudo[84491]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:26 compute-2 sudo[84539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:21:26 compute-2 sudo[84539]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:26 compute-2 sudo[84539]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:26 compute-2 sudo[84587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:21:26 compute-2 sudo[84587]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:26 compute-2 sudo[84587]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:26 compute-2 sudo[84630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:21:26 compute-2 sudo[84630]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:26 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.b scrub starts
Nov 29 06:21:26 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.b scrub ok
Nov 29 06:21:26 compute-2 python3.9[84687]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:21:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:28.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:28 compute-2 sudo[84956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdgrmoysozathzatpmxifpdmtiaavdhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397288.2103617-64-102222089160382/AnsiballZ_command.py'
Nov 29 06:21:28 compute-2 sudo[84956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:28 compute-2 ceph-mon[77142]: 7.4 scrub starts
Nov 29 06:21:28 compute-2 ceph-mon[77142]: 7.4 scrub ok
Nov 29 06:21:28 compute-2 ceph-mon[77142]: pgmap v191: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:28 compute-2 ceph-mon[77142]: 5.d deep-scrub starts
Nov 29 06:21:28 compute-2 ceph-mon[77142]: 5.d deep-scrub ok
Nov 29 06:21:28 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:28 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:28 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Nov 29 06:21:28 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:28 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:28 compute-2 ceph-mon[77142]: osdmap e63: 3 total, 3 up, 3 in
Nov 29 06:21:28 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:28 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:28 compute-2 ceph-mon[77142]: 5.b scrub starts
Nov 29 06:21:28 compute-2 ceph-mon[77142]: 5.b scrub ok
Nov 29 06:21:28 compute-2 python3.9[84958]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:21:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.10( v 54'96 (0'0,54'96] local-lis/les=62/64 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.4( v 54'96 (0'0,54'96] local-lis/les=62/64 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.f( v 54'96 (0'0,54'96] local-lis/les=62/64 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.1( v 54'96 (0'0,54'96] local-lis/les=62/64 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.12( v 54'96 (0'0,54'96] local-lis/les=62/64 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.11( v 54'96 (0'0,54'96] local-lis/les=62/64 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.c( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.3( v 59'99 lc 54'84 (0'0,59'99] local-lis/les=62/64 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=59'99 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.1e( v 54'96 (0'0,54'96] local-lis/les=62/64 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.1c( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.1f( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.13( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.3( v 46'4 (0'0,46'4] local-lis/les=62/64 n=1 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.11( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.17( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.a( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.19( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.a( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.f( v 46'4 lc 0'0 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.5( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.9( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.16( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.15( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.2( v 46'4 (0'0,46'4] local-lis/les=62/64 n=1 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.b( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.16( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.d( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.e( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.6( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.3( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.8( v 54'2 lc 0'0 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:29 compute-2 ceph-mon[77142]: pgmap v194: 305 pgs: 9 peering, 296 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:29 compute-2 ceph-mon[77142]: Deploying daemon haproxy.rgw.default.compute-2.lpqgfx on compute-2
Nov 29 06:21:29 compute-2 ceph-mon[77142]: pgmap v195: 305 pgs: 9 peering, 296 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 109 B/s, 0 objects/s recovering
Nov 29 06:21:29 compute-2 ceph-mon[77142]: 7.e scrub starts
Nov 29 06:21:29 compute-2 ceph-mon[77142]: 7.e scrub ok
Nov 29 06:21:29 compute-2 ceph-mon[77142]: 6.2 scrub starts
Nov 29 06:21:29 compute-2 ceph-mon[77142]: 6.2 scrub ok
Nov 29 06:21:29 compute-2 ceph-mon[77142]: osdmap e64: 3 total, 3 up, 3 in
Nov 29 06:21:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:30.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:31 compute-2 ceph-mon[77142]: 7.9 scrub starts
Nov 29 06:21:31 compute-2 ceph-mon[77142]: 7.9 scrub ok
Nov 29 06:21:31 compute-2 ceph-mon[77142]: pgmap v197: 305 pgs: 40 peering, 265 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 145 B/s, 0 objects/s recovering
Nov 29 06:21:32 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:21:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:32.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:21:32 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Nov 29 06:21:32 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Nov 29 06:21:32 compute-2 ceph-mon[77142]: 7.10 scrub starts
Nov 29 06:21:32 compute-2 ceph-mon[77142]: 7.10 scrub ok
Nov 29 06:21:32 compute-2 ceph-mon[77142]: pgmap v198: 305 pgs: 31 peering, 274 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 111 B/s, 0 objects/s recovering
Nov 29 06:21:32 compute-2 ceph-mon[77142]: 4.1b scrub starts
Nov 29 06:21:32 compute-2 ceph-mon[77142]: 4.1b scrub ok
Nov 29 06:21:32 compute-2 ceph-mon[77142]: 5.8 scrub starts
Nov 29 06:21:32 compute-2 ceph-mon[77142]: 5.8 scrub ok
Nov 29 06:21:33 compute-2 podman[84737]: 2025-11-29 06:21:33.395221555 +0000 UTC m=+6.492017014 container create 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 06:21:33 compute-2 systemd[1]: Started libpod-conmon-30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901.scope.
Nov 29 06:21:33 compute-2 podman[84737]: 2025-11-29 06:21:33.378476669 +0000 UTC m=+6.475272198 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Nov 29 06:21:33 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:21:33 compute-2 podman[84737]: 2025-11-29 06:21:33.516796364 +0000 UTC m=+6.613591833 container init 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 06:21:33 compute-2 podman[84737]: 2025-11-29 06:21:33.533869689 +0000 UTC m=+6.630665168 container start 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 06:21:33 compute-2 podman[84737]: 2025-11-29 06:21:33.538037088 +0000 UTC m=+6.634832537 container attach 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 06:21:33 compute-2 goofy_sanderson[85078]: 0 0
Nov 29 06:21:33 compute-2 systemd[1]: libpod-30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901.scope: Deactivated successfully.
Nov 29 06:21:33 compute-2 podman[84737]: 2025-11-29 06:21:33.544511737 +0000 UTC m=+6.641307216 container died 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 06:21:33 compute-2 systemd[1]: var-lib-containers-storage-overlay-b2aa4391ada0a5dd4a07574f11fd1e65c1794862a3f8f75f8b6449d4da86dc32-merged.mount: Deactivated successfully.
Nov 29 06:21:33 compute-2 podman[84737]: 2025-11-29 06:21:33.599004627 +0000 UTC m=+6.695800096 container remove 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 06:21:33 compute-2 systemd[1]: libpod-conmon-30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901.scope: Deactivated successfully.
Nov 29 06:21:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:21:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:34.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:21:34 compute-2 systemd[1]: Reloading.
Nov 29 06:21:34 compute-2 systemd-rc-local-generator[85129]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:21:34 compute-2 systemd-sysv-generator[85134]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:21:34 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Nov 29 06:21:34 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Nov 29 06:21:34 compute-2 systemd[1]: Reloading.
Nov 29 06:21:34 compute-2 systemd-sysv-generator[85177]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:21:34 compute-2 systemd-rc-local-generator[85174]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:21:34 compute-2 ceph-mon[77142]: 7.b scrub starts
Nov 29 06:21:34 compute-2 ceph-mon[77142]: 7.b scrub ok
Nov 29 06:21:34 compute-2 ceph-mon[77142]: 4.1a scrub starts
Nov 29 06:21:34 compute-2 ceph-mon[77142]: 4.1a scrub ok
Nov 29 06:21:34 compute-2 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.lpqgfx for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:21:35 compute-2 podman[85231]: 2025-11-29 06:21:35.202461671 +0000 UTC m=+0.024800177 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Nov 29 06:21:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Nov 29 06:21:35 compute-2 podman[85231]: 2025-11-29 06:21:35.405236807 +0000 UTC m=+0.227575293 container create e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:21:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e077d830398853e06c0755f719428ef7611a0c98d04678e5362d34dcac8c5a5/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Nov 29 06:21:35 compute-2 podman[85231]: 2025-11-29 06:21:35.496653779 +0000 UTC m=+0.318992325 container init e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:21:35 compute-2 podman[85231]: 2025-11-29 06:21:35.502186934 +0000 UTC m=+0.324525440 container start e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:21:35 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx[85246]: [NOTICE] 332/062135 (2) : New worker #1 (4) forked
Nov 29 06:21:35 compute-2 bash[85231]: e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629
Nov 29 06:21:35 compute-2 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.lpqgfx for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:21:35 compute-2 sudo[84630]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:36.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:36.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:38.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:38 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:38.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:39 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Nov 29 06:21:39 compute-2 ceph-mon[77142]: pgmap v199: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 118 B/s, 0 objects/s recovering
Nov 29 06:21:39 compute-2 ceph-mon[77142]: 3.15 scrub starts
Nov 29 06:21:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Nov 29 06:21:39 compute-2 ceph-mon[77142]: 3.15 scrub ok
Nov 29 06:21:39 compute-2 ceph-mon[77142]: 6.5 scrub starts
Nov 29 06:21:39 compute-2 ceph-mon[77142]: 6.5 scrub ok
Nov 29 06:21:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Nov 29 06:21:39 compute-2 ceph-mon[77142]: osdmap e65: 3 total, 3 up, 3 in
Nov 29 06:21:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:40.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:21:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:40.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:21:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.7( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.13( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.1b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.3( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.17( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:41 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Nov 29 06:21:41 compute-2 sudo[84956]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:41 compute-2 sshd-session[84437]: Connection closed by 192.168.122.30 port 41814
Nov 29 06:21:41 compute-2 sshd-session[84434]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:21:41 compute-2 systemd[1]: session-33.scope: Deactivated successfully.
Nov 29 06:21:41 compute-2 systemd[1]: session-33.scope: Consumed 9.673s CPU time.
Nov 29 06:21:41 compute-2 systemd-logind[784]: Session 33 logged out. Waiting for processes to exit.
Nov 29 06:21:41 compute-2 systemd-logind[784]: Removed session 33.
Nov 29 06:21:42 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Nov 29 06:21:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:42.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:42.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:43 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e66 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:44.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:44.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Nov 29 06:21:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:46.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:46 compute-2 ceph-mon[77142]: pgmap v201: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 13 B/s, 0 objects/s recovering
Nov 29 06:21:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 06:21:46 compute-2 ceph-mon[77142]: 4.e scrub starts
Nov 29 06:21:46 compute-2 ceph-mon[77142]: 4.e scrub ok
Nov 29 06:21:46 compute-2 ceph-mon[77142]: 6.3 deep-scrub starts
Nov 29 06:21:46 compute-2 ceph-mon[77142]: 6.3 deep-scrub ok
Nov 29 06:21:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:46 compute-2 ceph-mon[77142]: pgmap v202: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 11 B/s, 0 objects/s recovering
Nov 29 06:21:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 06:21:46 compute-2 ceph-mon[77142]: 7.f scrub starts
Nov 29 06:21:46 compute-2 ceph-mon[77142]: 7.f scrub ok
Nov 29 06:21:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 06:21:46 compute-2 ceph-mon[77142]: pgmap v203: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:21:46 compute-2 ceph-mon[77142]: 6.7 scrub starts
Nov 29 06:21:46 compute-2 ceph-mon[77142]: 6.7 scrub ok
Nov 29 06:21:46 compute-2 ceph-mon[77142]: osdmap e66: 3 total, 3 up, 3 in
Nov 29 06:21:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 06:21:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:46.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:47 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.e scrub starts
Nov 29 06:21:47 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.e scrub ok
Nov 29 06:21:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:21:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:48.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:21:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Nov 29 06:21:48 compute-2 ceph-mon[77142]: 5.13 scrub starts
Nov 29 06:21:48 compute-2 ceph-mon[77142]: pgmap v205: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Nov 29 06:21:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:48 compute-2 ceph-mon[77142]: 7.8 scrub starts
Nov 29 06:21:48 compute-2 ceph-mon[77142]: 7.8 scrub ok
Nov 29 06:21:48 compute-2 ceph-mon[77142]: pgmap v206: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:48 compute-2 ceph-mon[77142]: 4.d scrub starts
Nov 29 06:21:48 compute-2 ceph-mon[77142]: 4.d scrub ok
Nov 29 06:21:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 06:21:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 06:21:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:48 compute-2 ceph-mon[77142]: osdmap e67: 3 total, 3 up, 3 in
Nov 29 06:21:48 compute-2 ceph-mon[77142]: 3.16 scrub starts
Nov 29 06:21:48 compute-2 ceph-mon[77142]: 3.16 scrub ok
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.1b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.1b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.17( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.17( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.13( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.13( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.3( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.3( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.7( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:48 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.7( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:48.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:49 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Nov 29 06:21:49 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Nov 29 06:21:50 compute-2 sudo[85294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:21:50 compute-2 sudo[85294]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:50 compute-2 sudo[85294]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:50 compute-2 sudo[85319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:21:50 compute-2 sudo[85319]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:50 compute-2 sudo[85319]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:50.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:50 compute-2 sudo[85344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:21:50 compute-2 sudo[85344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:50 compute-2 sudo[85344]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:50 compute-2 sudo[85369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:21:50 compute-2 sudo[85369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:50 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Nov 29 06:21:50 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Nov 29 06:21:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:50.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Nov 29 06:21:51 compute-2 ceph-mon[77142]: 5.13 scrub ok
Nov 29 06:21:51 compute-2 ceph-mon[77142]: 7.1e scrub starts
Nov 29 06:21:51 compute-2 ceph-mon[77142]: pgmap v208: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:51 compute-2 ceph-mon[77142]: 7.1e scrub ok
Nov 29 06:21:51 compute-2 ceph-mon[77142]: 7.1b scrub starts
Nov 29 06:21:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:51 compute-2 ceph-mon[77142]: 5.10 scrub starts
Nov 29 06:21:51 compute-2 ceph-mon[77142]: 5.10 scrub ok
Nov 29 06:21:51 compute-2 ceph-mon[77142]: 7.1b scrub ok
Nov 29 06:21:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Nov 29 06:21:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:52.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:21:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:52.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:21:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:54.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:54 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:54.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 3.e scrub starts
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 3.e scrub ok
Nov 29 06:21:55 compute-2 ceph-mon[77142]: pgmap v209: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 5.1a scrub starts
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 5.1a scrub ok
Nov 29 06:21:55 compute-2 ceph-mon[77142]: osdmap e68: 3 total, 3 up, 3 in
Nov 29 06:21:55 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 29 06:21:55 compute-2 ceph-mon[77142]: Deploying daemon keepalived.rgw.default.compute-2.klqjoa on compute-2
Nov 29 06:21:55 compute-2 ceph-mon[77142]: pgmap v211: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 3.1d scrub starts
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 3.1d scrub ok
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 7.2 scrub starts
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 7.2 scrub ok
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 5.11 scrub starts
Nov 29 06:21:55 compute-2 ceph-mon[77142]: 5.11 scrub ok
Nov 29 06:21:55 compute-2 ceph-mon[77142]: pgmap v212: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:55 compute-2 ceph-mon[77142]: osdmap e69: 3 total, 3 up, 3 in
Nov 29 06:21:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Nov 29 06:21:55 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 70 pg[9.17( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=70) [2] r=0 lpr=70 pi=[58,70)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:55 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 70 pg[9.17( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=70) [2] r=0 lpr=70 pi=[58,70)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:56 compute-2 ceph-mon[77142]: 3.14 scrub starts
Nov 29 06:21:56 compute-2 ceph-mon[77142]: 3.14 scrub ok
Nov 29 06:21:56 compute-2 ceph-mon[77142]: pgmap v214: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:56 compute-2 ceph-mon[77142]: osdmap e70: 3 total, 3 up, 3 in
Nov 29 06:21:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:56.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:56 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Nov 29 06:21:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:56.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.7( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.13( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.7( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.3( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.b( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.13( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.b( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.3( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:56 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.17( v 56'1130 (0'0,56'1130] local-lis/les=70/71 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=70) [2] r=0 lpr=70 pi=[58,70)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:57 compute-2 sshd-session[85499]: Accepted publickey for zuul from 192.168.122.30 port 35394 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:21:57 compute-2 systemd-logind[784]: New session 34 of user zuul.
Nov 29 06:21:57 compute-2 systemd[1]: Started Session 34 of User zuul.
Nov 29 06:21:57 compute-2 sshd-session[85499]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:21:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:58.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:58 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Nov 29 06:21:58 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Nov 29 06:21:58 compute-2 python3.9[85652]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 06:21:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:21:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:58.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:58 compute-2 ceph-mon[77142]: 5.1c scrub starts
Nov 29 06:21:58 compute-2 ceph-mon[77142]: 5.1c scrub ok
Nov 29 06:21:58 compute-2 ceph-mon[77142]: pgmap v216: 305 pgs: 6 active+remapped, 1 active+recovering+remapped, 1 unknown, 297 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 49 KiB/s rd, 984 B/s wr, 87 op/s; 6/210 objects misplaced (2.857%); 120 B/s, 4 objects/s recovering
Nov 29 06:21:58 compute-2 ceph-mon[77142]: osdmap e71: 3 total, 3 up, 3 in
Nov 29 06:21:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Nov 29 06:21:59 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:59 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:59 compute-2 python3.9[85841]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:21:59 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:59 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.7( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:59 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.13( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:59 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.3( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:59 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:00 compute-2 ceph-mon[77142]: 3.1c scrub starts
Nov 29 06:22:00 compute-2 ceph-mon[77142]: pgmap v218: 305 pgs: 6 active+remapped, 1 active+recovering+remapped, 1 unknown, 297 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 59 KiB/s rd, 1.2 KiB/s wr, 106 op/s; 6/210 objects misplaced (2.857%); 146 B/s, 4 objects/s recovering
Nov 29 06:22:00 compute-2 ceph-mon[77142]: 3.1c scrub ok
Nov 29 06:22:00 compute-2 ceph-mon[77142]: 3.1b scrub starts
Nov 29 06:22:00 compute-2 ceph-mon[77142]: 3.1b scrub ok
Nov 29 06:22:00 compute-2 ceph-mon[77142]: osdmap e72: 3 total, 3 up, 3 in
Nov 29 06:22:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:22:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:00.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:22:00 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Nov 29 06:22:00 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Nov 29 06:22:00 compute-2 podman[85434]: 2025-11-29 06:22:00.825529697 +0000 UTC m=+10.288389575 container create b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, distribution-scope=public, io.buildah.version=1.28.2)
Nov 29 06:22:00 compute-2 systemd[1]: Started libpod-conmon-b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf.scope.
Nov 29 06:22:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:00.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:00 compute-2 podman[85434]: 2025-11-29 06:22:00.806166755 +0000 UTC m=+10.269026633 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Nov 29 06:22:00 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:22:00 compute-2 sudo[86014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqfmhctmgswxxoqulrshoynxwhpgguza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397320.5306985-101-166520987867899/AnsiballZ_command.py'
Nov 29 06:22:00 compute-2 sudo[86014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:01 compute-2 python3.9[86016]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:22:01 compute-2 sudo[86014]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:01 compute-2 podman[85434]: 2025-11-29 06:22:01.675647722 +0000 UTC m=+11.138507620 container init b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, com.redhat.component=keepalived-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, name=keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, release=1793, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 29 06:22:01 compute-2 podman[85434]: 2025-11-29 06:22:01.685302037 +0000 UTC m=+11.148161925 container start b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=keepalived, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, release=1793, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, version=2.2.4)
Nov 29 06:22:01 compute-2 clever_pike[85981]: 0 0
Nov 29 06:22:01 compute-2 systemd[1]: libpod-b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf.scope: Deactivated successfully.
Nov 29 06:22:02 compute-2 podman[85434]: 2025-11-29 06:22:02.040149735 +0000 UTC m=+11.503009613 container attach b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, com.redhat.component=keepalived-container, release=1793, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, version=2.2.4, distribution-scope=public, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived)
Nov 29 06:22:02 compute-2 ceph-mon[77142]: 3.12 scrub starts
Nov 29 06:22:02 compute-2 ceph-mon[77142]: 3.12 scrub ok
Nov 29 06:22:02 compute-2 ceph-mon[77142]: 5.7 scrub starts
Nov 29 06:22:02 compute-2 ceph-mon[77142]: 5.7 scrub ok
Nov 29 06:22:02 compute-2 ceph-mon[77142]: 3.13 scrub starts
Nov 29 06:22:02 compute-2 ceph-mon[77142]: 3.13 scrub ok
Nov 29 06:22:02 compute-2 ceph-mon[77142]: pgmap v220: 305 pgs: 7 peering, 298 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 67 KiB/s rd, 1.3 KiB/s wr, 121 op/s; 0 B/s, 0 objects/s recovering
Nov 29 06:22:02 compute-2 ceph-mon[77142]: 4.19 scrub starts
Nov 29 06:22:02 compute-2 ceph-mon[77142]: 4.19 scrub ok
Nov 29 06:22:02 compute-2 podman[85434]: 2025-11-29 06:22:02.041710283 +0000 UTC m=+11.504570171 container died b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., release=1793, io.openshift.tags=Ceph keepalived, version=2.2.4, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph.)
Nov 29 06:22:02 compute-2 sudo[86180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltcatdvaxvuosvosofoqgyxjmfdhecgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397321.6382914-137-240605228535402/AnsiballZ_stat.py'
Nov 29 06:22:02 compute-2 sudo[86180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:02.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:02 compute-2 python3.9[86183]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:22:02 compute-2 sudo[86180]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:02 compute-2 systemd[1]: var-lib-containers-storage-overlay-336a23a73ed125d91fb69388f54c6e326bced95e89e171120698e6c148a33348-merged.mount: Deactivated successfully.
Nov 29 06:22:02 compute-2 podman[85434]: 2025-11-29 06:22:02.860731501 +0000 UTC m=+12.323591419 container remove b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, name=keepalived, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., version=2.2.4, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 06:22:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:02.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:02 compute-2 systemd[1]: libpod-conmon-b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf.scope: Deactivated successfully.
Nov 29 06:22:03 compute-2 systemd[1]: Reloading.
Nov 29 06:22:03 compute-2 ceph-mon[77142]: 3.17 scrub starts
Nov 29 06:22:03 compute-2 ceph-mon[77142]: 3.17 scrub ok
Nov 29 06:22:03 compute-2 ceph-mon[77142]: 5.1b scrub starts
Nov 29 06:22:03 compute-2 ceph-mon[77142]: 5.1b scrub ok
Nov 29 06:22:03 compute-2 ceph-mon[77142]: 3.18 scrub starts
Nov 29 06:22:03 compute-2 ceph-mon[77142]: 3.18 scrub ok
Nov 29 06:22:03 compute-2 ceph-mon[77142]: pgmap v221: 305 pgs: 7 peering, 298 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 60 KiB/s rd, 1.2 KiB/s wr, 108 op/s; 0 B/s, 0 objects/s recovering
Nov 29 06:22:03 compute-2 systemd-sysv-generator[86343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:22:03 compute-2 systemd-rc-local-generator[86339]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:22:03 compute-2 systemd[1]: Reloading.
Nov 29 06:22:03 compute-2 systemd-rc-local-generator[86407]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:22:03 compute-2 systemd-sysv-generator[86410]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:22:03 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Nov 29 06:22:03 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Nov 29 06:22:03 compute-2 sudo[86381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnrjvqjaenselskcyjsuocsvsrbezpwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397322.7265117-170-16466175519826/AnsiballZ_file.py'
Nov 29 06:22:03 compute-2 sudo[86381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:03 compute-2 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.klqjoa for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:22:03 compute-2 python3.9[86419]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:22:03 compute-2 sudo[86381]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:03 compute-2 podman[86465]: 2025-11-29 06:22:03.733275192 +0000 UTC m=+0.021292719 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Nov 29 06:22:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:04.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:04 compute-2 sudo[86627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnywazupsxdlzuecsiuixuswgofbvzhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397324.0218043-197-219516557009589/AnsiballZ_file.py'
Nov 29 06:22:04 compute-2 sudo[86627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:22:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:04.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:22:05 compute-2 python3.9[86629]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:22:05 compute-2 sudo[86627]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:05 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.1d deep-scrub starts
Nov 29 06:22:05 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.1d deep-scrub ok
Nov 29 06:22:05 compute-2 podman[86465]: 2025-11-29 06:22:05.634933754 +0000 UTC m=+1.922951261 container create d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, com.redhat.component=keepalived-container, release=1793, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.buildah.version=1.28.2, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4)
Nov 29 06:22:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:06 compute-2 python3.9[86780]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:22:06 compute-2 network[86797]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:22:06 compute-2 network[86798]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:22:06 compute-2 network[86799]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:22:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:06.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 3.1 scrub starts
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 3.1 scrub ok
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 5.f scrub starts
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 5.f scrub ok
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 7.14 scrub starts
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 7.14 scrub ok
Nov 29 06:22:06 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/225a3e51dd02ee070057392a07ad23b72c8c2d48a7d4402bcf59dce250c46a6c/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:22:06 compute-2 podman[86465]: 2025-11-29 06:22:06.256619708 +0000 UTC m=+2.544637245 container init d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=Ceph keepalived, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, io.openshift.expose-services=, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, release=1793, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Nov 29 06:22:06 compute-2 podman[86465]: 2025-11-29 06:22:06.262544793 +0000 UTC m=+2.550562310 container start d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, distribution-scope=public, io.buildah.version=1.28.2, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, vendor=Red Hat, Inc., release=1793, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64)
Nov 29 06:22:06 compute-2 bash[86465]: d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35
Nov 29 06:22:06 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Starting Keepalived v2.2.4 (08/21,2021)
Nov 29 06:22:06 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Running on Linux 5.14.0-642.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025 (built for Linux 5.14.0)
Nov 29 06:22:06 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Nov 29 06:22:06 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Configuration file /etc/keepalived/keepalived.conf
Nov 29 06:22:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Nov 29 06:22:06 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Nov 29 06:22:06 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Starting VRRP child process, pid=4
Nov 29 06:22:06 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Startup complete
Nov 29 06:22:06 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: (VI_0) Entering BACKUP STATE (init)
Nov 29 06:22:06 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 73 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=73) [2] r=0 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:06 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 73 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=73) [2] r=0 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:06 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 73 pg[9.5( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=73) [2] r=0 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:06 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 73 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=73) [2] r=0 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:06 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: VRRP_Script(check_backend) succeeded
Nov 29 06:22:06 compute-2 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.klqjoa for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:22:06 compute-2 sudo[85369]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:06.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 5.14 scrub starts
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 5.14 scrub ok
Nov 29 06:22:06 compute-2 ceph-mon[77142]: pgmap v222: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 6.2 KiB/s rd, 127 B/s wr, 11 op/s; 41 B/s, 1 objects/s recovering
Nov 29 06:22:06 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 5.1f scrub starts
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 5.1f scrub ok
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 7.1d deep-scrub starts
Nov 29 06:22:06 compute-2 ceph-mon[77142]: 7.1d deep-scrub ok
Nov 29 06:22:06 compute-2 ceph-mon[77142]: pgmap v223: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 5.4 KiB/s rd, 110 B/s wr, 9 op/s; 35 B/s, 0 objects/s recovering
Nov 29 06:22:06 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Nov 29 06:22:06 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Nov 29 06:22:06 compute-2 ceph-mon[77142]: osdmap e73: 3 total, 3 up, 3 in
Nov 29 06:22:07 compute-2 sshd-session[86825]: Invalid user testuser from 92.118.39.92 port 36744
Nov 29 06:22:07 compute-2 sshd-session[86825]: Connection closed by invalid user testuser 92.118.39.92 port 36744 [preauth]
Nov 29 06:22:07 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Nov 29 06:22:07 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.5( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:07 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.5( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:07 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:07 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:07 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:07 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:07 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:07 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:08 compute-2 ceph-mon[77142]: 5.17 scrub starts
Nov 29 06:22:08 compute-2 ceph-mon[77142]: 5.17 scrub ok
Nov 29 06:22:08 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:08 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:08 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Nov 29 06:22:08 compute-2 ceph-mon[77142]: osdmap e74: 3 total, 3 up, 3 in
Nov 29 06:22:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:08.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:08.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:09 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Nov 29 06:22:09 compute-2 ceph-mon[77142]: 5.1e scrub starts
Nov 29 06:22:09 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:09 compute-2 ceph-mon[77142]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 29 06:22:09 compute-2 ceph-mon[77142]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 29 06:22:09 compute-2 ceph-mon[77142]: Deploying daemon keepalived.rgw.default.compute-0.uyqrbs on compute-0
Nov 29 06:22:09 compute-2 ceph-mon[77142]: 5.1e scrub ok
Nov 29 06:22:09 compute-2 ceph-mon[77142]: pgmap v226: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 151 B/s, 4 objects/s recovering
Nov 29 06:22:09 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Nov 29 06:22:09 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Nov 29 06:22:09 compute-2 ceph-mon[77142]: osdmap e75: 3 total, 3 up, 3 in
Nov 29 06:22:09 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:09 2025: (VI_0) Entering MASTER STATE
Nov 29 06:22:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:10.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:10 compute-2 python3.9[87072]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:22:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Nov 29 06:22:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:10.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:11 compute-2 python3.9[87223]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:22:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:12 compute-2 ceph-mon[77142]: pgmap v228: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:22:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Nov 29 06:22:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:12.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:12 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Nov 29 06:22:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:12.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:13 compute-2 python3.9[87378]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:22:13 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:13 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:13 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:13 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:13 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.5( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:13 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.5( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:13 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:13 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:13 compute-2 ceph-mon[77142]: osdmap e76: 3 total, 3 up, 3 in
Nov 29 06:22:13 compute-2 ceph-mon[77142]: 5.1d scrub starts
Nov 29 06:22:13 compute-2 ceph-mon[77142]: 5.1d scrub ok
Nov 29 06:22:13 compute-2 ceph-mon[77142]: pgmap v230: 305 pgs: 1 active+recovering+remapped, 1 active+remapped, 2 active+recovery_wait+remapped, 301 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 15/215 objects misplaced (6.977%); 38 B/s, 1 objects/s recovering
Nov 29 06:22:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Nov 29 06:22:13 compute-2 ceph-mon[77142]: osdmap e77: 3 total, 3 up, 3 in
Nov 29 06:22:13 compute-2 sudo[87534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrsgplisxlitrkeqkzzdhexnnrpzdujx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397333.6661682-341-81170183932669/AnsiballZ_setup.py'
Nov 29 06:22:13 compute-2 sudo[87534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:14.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:14 compute-2 python3.9[87536]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:22:14 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Nov 29 06:22:14 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Nov 29 06:22:14 compute-2 sudo[87534]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:22:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:14.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:22:14 compute-2 sudo[87619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tunmukkgqbxrankbcftvgvggiiqvvmaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397333.6661682-341-81170183932669/AnsiballZ_dnf.py'
Nov 29 06:22:14 compute-2 sudo[87619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:15 compute-2 python3.9[87621]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:22:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:22:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:16.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:22:16 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.a scrub starts
Nov 29 06:22:16 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.a scrub ok
Nov 29 06:22:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:16.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:17 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Nov 29 06:22:17 compute-2 ceph-mon[77142]: 5.15 scrub starts
Nov 29 06:22:17 compute-2 ceph-mon[77142]: 5.15 scrub ok
Nov 29 06:22:17 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Nov 29 06:22:17 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Nov 29 06:22:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:18.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 3.19 scrub starts
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 3.19 scrub ok
Nov 29 06:22:18 compute-2 ceph-mon[77142]: pgmap v232: 305 pgs: 1 active+recovering+remapped, 1 active+remapped, 2 active+recovery_wait+remapped, 301 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 15/215 objects misplaced (6.977%); 36 B/s, 1 objects/s recovering
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 7.5 scrub starts
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 7.5 scrub ok
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 5.18 scrub starts
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 5.18 scrub ok
Nov 29 06:22:18 compute-2 ceph-mon[77142]: pgmap v233: 305 pgs: 1 active+recovering+remapped, 5 active+remapped, 2 active+recovery_wait+remapped, 297 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 15/215 objects misplaced (6.977%); 63 B/s, 4 objects/s recovering
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 5.1 scrub starts
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 5.1 scrub ok
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 7.a scrub starts
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 7.a scrub ok
Nov 29 06:22:18 compute-2 ceph-mon[77142]: osdmap e78: 3 total, 3 up, 3 in
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 3.5 scrub starts
Nov 29 06:22:18 compute-2 ceph-mon[77142]: 3.5 scrub ok
Nov 29 06:22:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:18 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 79 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:18 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 79 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:18 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 79 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:18 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Nov 29 06:22:18 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 79 pg[9.5( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:18 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Nov 29 06:22:18 compute-2 sudo[87680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:18 compute-2 sudo[87680]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:18 compute-2 sudo[87680]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:18 compute-2 sudo[87707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:18 compute-2 sudo[87707]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:18 compute-2 sudo[87707]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:18 compute-2 sudo[87734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:18 compute-2 sudo[87734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:18 compute-2 sudo[87734]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:18 compute-2 sudo[87759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:22:18 compute-2 sudo[87759]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:18 compute-2 sudo[87759]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:18.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:18 compute-2 sudo[87784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:18 compute-2 sudo[87784]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:18 compute-2 sudo[87784]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:18 compute-2 sudo[87809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:22:18 compute-2 sudo[87809]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:18 compute-2 sudo[87809]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:19 compute-2 sudo[87837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:19 compute-2 sudo[87837]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:19 compute-2 sudo[87837]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:19 compute-2 sudo[87862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:22:19 compute-2 sudo[87862]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:19 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Nov 29 06:22:19 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Nov 29 06:22:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:20.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:20 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:20 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Nov 29 06:22:20 compute-2 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:20 2025: (VI_0) Entering BACKUP STATE
Nov 29 06:22:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:22:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:20.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:22:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:21 compute-2 podman[87957]: 2025-11-29 06:22:21.341349103 +0000 UTC m=+1.871605174 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:22:21 compute-2 podman[87957]: 2025-11-29 06:22:21.442188227 +0000 UTC m=+1.972444268 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:22:22 compute-2 podman[88109]: 2025-11-29 06:22:22.136202761 +0000 UTC m=+0.101623064 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:22:22 compute-2 podman[88109]: 2025-11-29 06:22:22.146192214 +0000 UTC m=+0.111612497 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:22:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:22.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:22 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Nov 29 06:22:22 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Nov 29 06:22:22 compute-2 podman[88171]: 2025-11-29 06:22:22.370789302 +0000 UTC m=+0.051910125 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived)
Nov 29 06:22:22 compute-2 podman[88171]: 2025-11-29 06:22:22.40934683 +0000 UTC m=+0.090467643 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, name=keepalived, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793)
Nov 29 06:22:22 compute-2 sudo[87862]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:22.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:23 compute-2 ceph-mon[77142]: 7.16 scrub starts
Nov 29 06:22:23 compute-2 ceph-mon[77142]: 7.16 scrub ok
Nov 29 06:22:23 compute-2 ceph-mon[77142]: 5.5 scrub starts
Nov 29 06:22:23 compute-2 ceph-mon[77142]: 5.5 scrub ok
Nov 29 06:22:23 compute-2 ceph-mon[77142]: pgmap v235: 305 pgs: 1 active+recovering+remapped, 5 active+remapped, 2 active+recovery_wait+remapped, 297 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 15/215 objects misplaced (6.977%); 30 B/s, 2 objects/s recovering
Nov 29 06:22:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:23 compute-2 ceph-mon[77142]: osdmap e79: 3 total, 3 up, 3 in
Nov 29 06:22:23 compute-2 ceph-mon[77142]: 7.1f scrub starts
Nov 29 06:22:23 compute-2 ceph-mon[77142]: 7.1f scrub ok
Nov 29 06:22:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:23 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Nov 29 06:22:23 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 3.4 scrub starts
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 7.11 scrub starts
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 7.11 scrub ok
Nov 29 06:22:24 compute-2 ceph-mon[77142]: pgmap v237: 305 pgs: 4 peering, 4 active+remapped, 297 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 78 B/s, 4 objects/s recovering
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 3.4 scrub ok
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 3.1e scrub starts
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 3.1e scrub ok
Nov 29 06:22:24 compute-2 ceph-mon[77142]: pgmap v238: 305 pgs: 4 peering, 301 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 74 B/s, 4 objects/s recovering
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 6.1 scrub starts
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 6.1 scrub ok
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 3.7 scrub starts
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 3.7 scrub ok
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 4.14 scrub starts
Nov 29 06:22:24 compute-2 ceph-mon[77142]: 4.14 scrub ok
Nov 29 06:22:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:24 compute-2 sudo[88210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:24 compute-2 sudo[88210]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:24 compute-2 sudo[88210]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:24 compute-2 sudo[88235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:22:24 compute-2 sudo[88235]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:24 compute-2 sudo[88235]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:24.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:24 compute-2 sudo[88260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:24 compute-2 sudo[88260]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:24 compute-2 sudo[88260]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:24 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Nov 29 06:22:24 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Nov 29 06:22:24 compute-2 sudo[88285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:22:24 compute-2 sudo[88285]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:24 compute-2 sudo[88285]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:24.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:25 compute-2 ceph-mon[77142]: pgmap v239: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 46 B/s, 1 objects/s recovering
Nov 29 06:22:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Nov 29 06:22:25 compute-2 ceph-mon[77142]: 4.1d scrub starts
Nov 29 06:22:25 compute-2 ceph-mon[77142]: 4.1d scrub ok
Nov 29 06:22:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Nov 29 06:22:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:26.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:26 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 80 pg[9.18( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=80) [2] r=0 lpr=80 pi=[58,80)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:26 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 80 pg[9.8( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=80) [2] r=0 lpr=80 pi=[58,80)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:22:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:22:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Nov 29 06:22:26 compute-2 ceph-mon[77142]: osdmap e80: 3 total, 3 up, 3 in
Nov 29 06:22:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:22:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:22:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:22:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Nov 29 06:22:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:26.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:27 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Nov 29 06:22:27 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 81 pg[9.19( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=81) [2] r=0 lpr=81 pi=[58,81)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:27 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 81 pg[9.9( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=81) [2] r=0 lpr=81 pi=[58,81)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:27 compute-2 ceph-mon[77142]: pgmap v241: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 46 B/s, 1 objects/s recovering
Nov 29 06:22:27 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Nov 29 06:22:27 compute-2 ceph-mon[77142]: osdmap e81: 3 total, 3 up, 3 in
Nov 29 06:22:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:28.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:28 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Nov 29 06:22:28 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.9( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:28 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.9( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:28 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.18( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:28 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.18( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:28 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.8( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:28 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.8( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:28 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.19( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:28 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.19( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:28.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:28 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Nov 29 06:22:29 compute-2 ceph-mon[77142]: pgmap v243: 305 pgs: 305 active+clean; 456 KiB data, 104 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:22:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Nov 29 06:22:29 compute-2 ceph-mon[77142]: 3.d scrub starts
Nov 29 06:22:29 compute-2 ceph-mon[77142]: 3.d scrub ok
Nov 29 06:22:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Nov 29 06:22:29 compute-2 ceph-mon[77142]: osdmap e82: 3 total, 3 up, 3 in
Nov 29 06:22:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:30.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Nov 29 06:22:30 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 84 pg[9.9( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:30 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 84 pg[9.9( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:30 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 84 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:30 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 84 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:30.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:31 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Nov 29 06:22:31 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Nov 29 06:22:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:32.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:32 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Nov 29 06:22:32 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Nov 29 06:22:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:32.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:33 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Nov 29 06:22:33 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Nov 29 06:22:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:34.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:34.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:36.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:36 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:22:36 compute-2 ceph-mon[77142]: osdmap e83: 3 total, 3 up, 3 in
Nov 29 06:22:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:22:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:36.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:22:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:37 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Nov 29 06:22:37 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Nov 29 06:22:37 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.18( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:37 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.18( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:37 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.8( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:37 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.8( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:37 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Nov 29 06:22:37 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.9( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:37 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:38 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Nov 29 06:22:38 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Nov 29 06:22:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:38.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:38 compute-2 sudo[88377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:38 compute-2 sudo[88377]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:38 compute-2 sudo[88377]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:38 compute-2 sudo[88402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:38 compute-2 sudo[88402]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:38 compute-2 sudo[88402]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:22:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:38.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:22:39 compute-2 ceph-mon[77142]: pgmap v246: 305 pgs: 4 unknown, 301 active+clean; 456 KiB data, 104 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:22:39 compute-2 ceph-mon[77142]: osdmap e84: 3 total, 3 up, 3 in
Nov 29 06:22:39 compute-2 ceph-mon[77142]: pgmap v248: 305 pgs: 4 active+remapped, 2 remapped+peering, 299 active+clean; 456 KiB data, 104 MiB used, 21 GiB / 21 GiB avail; 112 B/s, 4 objects/s recovering
Nov 29 06:22:39 compute-2 ceph-mon[77142]: 3.6 scrub starts
Nov 29 06:22:39 compute-2 ceph-mon[77142]: pgmap v249: 305 pgs: 2 peering, 2 active+remapped, 2 remapped+peering, 299 active+clean; 456 KiB data, 104 MiB used, 21 GiB / 21 GiB avail; 36 B/s, 1 objects/s recovering
Nov 29 06:22:39 compute-2 ceph-mon[77142]: 5.3 scrub starts
Nov 29 06:22:39 compute-2 ceph-mon[77142]: 3.6 scrub ok
Nov 29 06:22:39 compute-2 ceph-mon[77142]: 5.3 scrub ok
Nov 29 06:22:39 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Nov 29 06:22:39 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Nov 29 06:22:39 compute-2 systemd[72593]: Created slice User Background Tasks Slice.
Nov 29 06:22:39 compute-2 systemd[72593]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 06:22:39 compute-2 systemd[72593]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 06:22:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:40.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:40.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Nov 29 06:22:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 86 pg[9.18( v 56'1130 (0'0,56'1130] local-lis/les=85/86 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 86 pg[9.8( v 56'1130 (0'0,56'1130] local-lis/les=85/86 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:41 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.f deep-scrub starts
Nov 29 06:22:41 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.f deep-scrub ok
Nov 29 06:22:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:42.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:42 compute-2 ceph-mon[77142]: 4.15 scrub starts
Nov 29 06:22:42 compute-2 ceph-mon[77142]: 4.15 scrub ok
Nov 29 06:22:42 compute-2 ceph-mon[77142]: 4.1f scrub starts
Nov 29 06:22:42 compute-2 ceph-mon[77142]: 4.1f scrub ok
Nov 29 06:22:42 compute-2 ceph-mon[77142]: 4.1c scrub starts
Nov 29 06:22:42 compute-2 ceph-mon[77142]: 4.1c scrub ok
Nov 29 06:22:42 compute-2 ceph-mon[77142]: pgmap v250: 305 pgs: 2 peering, 4 active+remapped, 299 active+clean; 456 KiB data, 104 MiB used, 21 GiB / 21 GiB avail; 26 KiB/s rd, 530 B/s wr, 47 op/s; 56 B/s, 3 objects/s recovering
Nov 29 06:22:42 compute-2 ceph-mon[77142]: osdmap e85: 3 total, 3 up, 3 in
Nov 29 06:22:42 compute-2 ceph-mon[77142]: 10.4 scrub starts
Nov 29 06:22:42 compute-2 ceph-mon[77142]: 10.4 scrub ok
Nov 29 06:22:42 compute-2 ceph-mon[77142]: pgmap v252: 305 pgs: 2 peering, 4 active+remapped, 299 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 25 KiB/s rd, 511 B/s wr, 45 op/s; 54 B/s, 2 objects/s recovering
Nov 29 06:22:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:42.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:44.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:44 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Nov 29 06:22:44 compute-2 ceph-mon[77142]: 11.3 scrub starts
Nov 29 06:22:44 compute-2 ceph-mon[77142]: 11.3 scrub ok
Nov 29 06:22:44 compute-2 ceph-mon[77142]: 8.6 scrub starts
Nov 29 06:22:44 compute-2 ceph-mon[77142]: 8.6 scrub ok
Nov 29 06:22:44 compute-2 ceph-mon[77142]: pgmap v253: 305 pgs: 2 peering, 2 active+remapped, 301 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 0 B/s, 0 objects/s recovering
Nov 29 06:22:44 compute-2 ceph-mon[77142]: 5.6 scrub starts
Nov 29 06:22:44 compute-2 ceph-mon[77142]: 5.6 scrub ok
Nov 29 06:22:44 compute-2 ceph-mon[77142]: osdmap e86: 3 total, 3 up, 3 in
Nov 29 06:22:44 compute-2 ceph-mon[77142]: 10.f deep-scrub starts
Nov 29 06:22:44 compute-2 ceph-mon[77142]: 10.f deep-scrub ok
Nov 29 06:22:44 compute-2 ceph-mon[77142]: pgmap v255: 305 pgs: 4 peering, 301 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 41 B/s, 1 objects/s recovering
Nov 29 06:22:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:44.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:44 compute-2 sudo[88451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:44 compute-2 sudo[88451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:44 compute-2 sudo[88451]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:45 compute-2 sudo[88476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:22:45 compute-2 sudo[88476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:45 compute-2 sudo[88476]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:46 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Nov 29 06:22:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:46.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:46 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Nov 29 06:22:46 compute-2 ceph-mon[77142]: 3.2 scrub starts
Nov 29 06:22:46 compute-2 ceph-mon[77142]: 3.2 scrub ok
Nov 29 06:22:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:46 compute-2 ceph-mon[77142]: pgmap v256: 305 pgs: 2 peering, 303 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 41 B/s, 1 objects/s recovering
Nov 29 06:22:46 compute-2 ceph-mon[77142]: osdmap e87: 3 total, 3 up, 3 in
Nov 29 06:22:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 06:22:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Nov 29 06:22:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:22:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:46.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:47 compute-2 ceph-mon[77142]: Reconfiguring mon.compute-0 (monmap changed)...
Nov 29 06:22:47 compute-2 ceph-mon[77142]: Reconfiguring daemon mon.compute-0 on compute-0
Nov 29 06:22:47 compute-2 ceph-mon[77142]: pgmap v258: 305 pgs: 2 peering, 303 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 41 B/s, 1 objects/s recovering
Nov 29 06:22:47 compute-2 ceph-mon[77142]: 11.8 scrub starts
Nov 29 06:22:47 compute-2 ceph-mon[77142]: 11.8 scrub ok
Nov 29 06:22:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.vxabpq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 29 06:22:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:22:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:22:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:48.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:22:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:48.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:22:49 compute-2 ceph-mon[77142]: 3.c scrub starts
Nov 29 06:22:49 compute-2 ceph-mon[77142]: 3.c scrub ok
Nov 29 06:22:49 compute-2 ceph-mon[77142]: Reconfiguring mgr.compute-0.vxabpq (monmap changed)...
Nov 29 06:22:49 compute-2 ceph-mon[77142]: Reconfiguring daemon mgr.compute-0.vxabpq on compute-0
Nov 29 06:22:49 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Nov 29 06:22:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Nov 29 06:22:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:50.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:50 compute-2 ceph-mon[77142]: pgmap v259: 305 pgs: 305 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 27 B/s, 1 objects/s recovering
Nov 29 06:22:50 compute-2 ceph-mon[77142]: 5.19 scrub starts
Nov 29 06:22:50 compute-2 ceph-mon[77142]: 5.19 scrub ok
Nov 29 06:22:50 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Nov 29 06:22:50 compute-2 ceph-mon[77142]: osdmap e88: 3 total, 3 up, 3 in
Nov 29 06:22:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:50.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:51 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Nov 29 06:22:51 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Nov 29 06:22:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Nov 29 06:22:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:52 compute-2 ceph-mon[77142]: pgmap v261: 305 pgs: 305 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 25 B/s, 1 objects/s recovering
Nov 29 06:22:52 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Nov 29 06:22:52 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:52 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:52 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 06:22:52 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:22:52 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.b scrub starts
Nov 29 06:22:52 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.b scrub ok
Nov 29 06:22:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:52.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:52 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Nov 29 06:22:52 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 90 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90 pruub=13.367929459s) [0] r=-1 lpr=90 pi=[77,90)/1 crt=56'1130 mlcod 0'0 active pruub 197.759460449s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:52 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 90 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90 pruub=13.367857933s) [0] r=-1 lpr=90 pi=[77,90)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 197.759460449s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:52 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 90 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90 pruub=13.367554665s) [0] r=-1 lpr=90 pi=[77,90)/1 crt=56'1130 mlcod 0'0 active pruub 197.759429932s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:52 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 90 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90 pruub=13.367480278s) [0] r=-1 lpr=90 pi=[77,90)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 197.759429932s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:52.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:53 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.16 scrub starts
Nov 29 06:22:53 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.16 scrub ok
Nov 29 06:22:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:54.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:54.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:55 compute-2 ceph-mon[77142]: Reconfiguring crash.compute-0 (monmap changed)...
Nov 29 06:22:55 compute-2 ceph-mon[77142]: 10.11 scrub starts
Nov 29 06:22:55 compute-2 ceph-mon[77142]: Reconfiguring daemon crash.compute-0 on compute-0
Nov 29 06:22:55 compute-2 ceph-mon[77142]: 10.11 scrub ok
Nov 29 06:22:55 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Nov 29 06:22:55 compute-2 ceph-mon[77142]: osdmap e89: 3 total, 3 up, 3 in
Nov 29 06:22:55 compute-2 ceph-mon[77142]: pgmap v263: 305 pgs: 305 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:22:55 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Nov 29 06:22:55 compute-2 ceph-mon[77142]: 8.b scrub starts
Nov 29 06:22:55 compute-2 ceph-mon[77142]: 8.b scrub ok
Nov 29 06:22:55 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Nov 29 06:22:55 compute-2 ceph-mon[77142]: osdmap e90: 3 total, 3 up, 3 in
Nov 29 06:22:55 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Nov 29 06:22:55 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Nov 29 06:22:56 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.10 scrub starts
Nov 29 06:22:56 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.10 scrub ok
Nov 29 06:22:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:56.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:56.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:56 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:57 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Nov 29 06:22:57 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 91 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:57 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 91 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:57 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 91 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:57 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 91 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:57 compute-2 ceph-mon[77142]: 11.16 scrub starts
Nov 29 06:22:57 compute-2 ceph-mon[77142]: 11.16 scrub ok
Nov 29 06:22:57 compute-2 ceph-mon[77142]: pgmap v265: 305 pgs: 305 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:22:57 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Nov 29 06:22:58 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Nov 29 06:22:58 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Nov 29 06:22:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:58.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:58 compute-2 sudo[88537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:58 compute-2 sudo[88537]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:58 compute-2 sudo[88537]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:58 compute-2 sudo[88562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:58 compute-2 sudo[88562]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:58 compute-2 sudo[88562]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:22:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:58.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:00.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:00.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:01 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Nov 29 06:23:01 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:02 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.15 deep-scrub starts
Nov 29 06:23:02 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.15 deep-scrub ok
Nov 29 06:23:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:02.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:02 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 92 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=5 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] async=[0] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:23:02 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 92 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=6 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] async=[0] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:23:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:02.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:03 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Nov 29 06:23:03 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Nov 29 06:23:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:04.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:04 compute-2 ceph-mon[77142]: 3.3 scrub starts
Nov 29 06:23:04 compute-2 ceph-mon[77142]: 3.3 scrub ok
Nov 29 06:23:04 compute-2 ceph-mon[77142]: 10.3 scrub starts
Nov 29 06:23:04 compute-2 ceph-mon[77142]: 5.a scrub starts
Nov 29 06:23:04 compute-2 ceph-mon[77142]: 5.a scrub ok
Nov 29 06:23:04 compute-2 ceph-mon[77142]: 10.3 scrub ok
Nov 29 06:23:04 compute-2 ceph-mon[77142]: pgmap v266: 305 pgs: 305 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:04 compute-2 ceph-mon[77142]: 10.10 scrub starts
Nov 29 06:23:04 compute-2 ceph-mon[77142]: 10.10 scrub ok
Nov 29 06:23:04 compute-2 ceph-mon[77142]: 3.b scrub starts
Nov 29 06:23:04 compute-2 ceph-mon[77142]: 3.b scrub ok
Nov 29 06:23:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Nov 29 06:23:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:04 compute-2 ceph-mon[77142]: osdmap e91: 3 total, 3 up, 3 in
Nov 29 06:23:04 compute-2 ceph-mon[77142]: pgmap v268: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:04 compute-2 ceph-mon[77142]: Reconfiguring osd.1 (monmap changed)...
Nov 29 06:23:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Nov 29 06:23:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:04 compute-2 ceph-mon[77142]: Reconfiguring daemon osd.1 on compute-0
Nov 29 06:23:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:04.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:06.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Nov 29 06:23:06 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 93 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=5 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93 pruub=11.806956291s) [0] async=[0] r=-1 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 56'1130 active pruub 210.056961060s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:06 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 93 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=5 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93 pruub=11.806859970s) [0] r=-1 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 210.056961060s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:06 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 93 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=6 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93 pruub=11.809009552s) [0] async=[0] r=-1 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 56'1130 active pruub 210.059432983s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:06 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 93 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=6 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93 pruub=11.808897972s) [0] r=-1 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 210.059432983s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:06.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:07 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Nov 29 06:23:07 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 10.1 scrub starts
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 10.1 scrub ok
Nov 29 06:23:07 compute-2 ceph-mon[77142]: pgmap v269: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 5.9 scrub starts
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 5.9 scrub ok
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 3.1f deep-scrub starts
Nov 29 06:23:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Nov 29 06:23:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 3.1f deep-scrub ok
Nov 29 06:23:07 compute-2 ceph-mon[77142]: osdmap e92: 3 total, 3 up, 3 in
Nov 29 06:23:07 compute-2 ceph-mon[77142]: pgmap v271: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 8.15 deep-scrub starts
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 8.15 deep-scrub ok
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 3.f scrub starts
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 3.f scrub ok
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 8.5 scrub starts
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 8.5 scrub ok
Nov 29 06:23:07 compute-2 ceph-mon[77142]: pgmap v272: 305 pgs: 2 active+remapped, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 41 B/s, 1 objects/s recovering
Nov 29 06:23:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 06:23:07 compute-2 ceph-mon[77142]: 8.1 scrub starts
Nov 29 06:23:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:23:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:08.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:23:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:23:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:08.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:23:09 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Nov 29 06:23:09 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Nov 29 06:23:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:10.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:10 compute-2 sudo[87619]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:10.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:12.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:12 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Nov 29 06:23:12 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 94 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94 pruub=14.999697685s) [0] r=-1 lpr=94 pi=[71,94)/1 crt=56'1130 mlcod 0'0 active pruub 218.826431274s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:12 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 94 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94 pruub=14.999450684s) [0] r=-1 lpr=94 pi=[71,94)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 218.826431274s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:12 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 94 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94 pruub=15.610712051s) [0] r=-1 lpr=94 pi=[71,94)/1 crt=56'1130 mlcod 0'0 active pruub 219.438018799s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:12 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 94 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94 pruub=15.610674858s) [0] r=-1 lpr=94 pi=[71,94)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 219.438018799s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:12.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:13 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.9 deep-scrub starts
Nov 29 06:23:13 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.9 deep-scrub ok
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 8.1 scrub ok
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 5.16 deep-scrub starts
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 5.16 deep-scrub ok
Nov 29 06:23:14 compute-2 ceph-mon[77142]: pgmap v273: 305 pgs: 2 active+remapped, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 119 B/s wr, 20 op/s; 76 B/s, 2 objects/s recovering
Nov 29 06:23:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 06:23:14 compute-2 ceph-mon[77142]: osdmap e93: 3 total, 3 up, 3 in
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 8.2 scrub starts
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 8.2 scrub ok
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 8.7 deep-scrub starts
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 8.7 deep-scrub ok
Nov 29 06:23:14 compute-2 ceph-mon[77142]: pgmap v275: 305 pgs: 2 active+remapped, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 127 B/s wr, 22 op/s; 82 B/s, 3 objects/s recovering
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 3.10 scrub starts
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 8.16 scrub starts
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 3.10 scrub ok
Nov 29 06:23:14 compute-2 ceph-mon[77142]: 8.16 scrub ok
Nov 29 06:23:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 06:23:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:14.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:14.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:15 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.a scrub starts
Nov 29 06:23:15 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.a scrub ok
Nov 29 06:23:16 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.a scrub starts
Nov 29 06:23:16 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.a scrub ok
Nov 29 06:23:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:16.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Nov 29 06:23:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:23:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:16.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:23:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:17 compute-2 ceph-mon[77142]: pgmap v276: 305 pgs: 2 active+remapped, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 0 B/s wr, 21 op/s; 80 B/s, 3 objects/s recovering
Nov 29 06:23:17 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 06:23:17 compute-2 ceph-mon[77142]: 10.6 scrub starts
Nov 29 06:23:17 compute-2 ceph-mon[77142]: 8.e scrub starts
Nov 29 06:23:17 compute-2 ceph-mon[77142]: 8.e scrub ok
Nov 29 06:23:17 compute-2 ceph-mon[77142]: 10.6 scrub ok
Nov 29 06:23:17 compute-2 ceph-mon[77142]: 8.13 scrub starts
Nov 29 06:23:17 compute-2 ceph-mon[77142]: pgmap v277: 305 pgs: 2 peering, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 9.6 KiB/s rd, 0 B/s wr, 17 op/s; 32 B/s, 1 objects/s recovering
Nov 29 06:23:17 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 06:23:17 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 06:23:17 compute-2 ceph-mon[77142]: 8.13 scrub ok
Nov 29 06:23:17 compute-2 ceph-mon[77142]: osdmap e94: 3 total, 3 up, 3 in
Nov 29 06:23:18 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.f scrub starts
Nov 29 06:23:18 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.f scrub ok
Nov 29 06:23:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:18.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Nov 29 06:23:18 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 96 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:18 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 96 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:18 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 96 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:18 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 96 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:18 compute-2 sudo[88621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:18 compute-2 sudo[88621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:18 compute-2 sudo[88621]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:18.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:19 compute-2 sudo[88646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:19 compute-2 sudo[88646]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:19 compute-2 sudo[88646]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:19 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.d scrub starts
Nov 29 06:23:19 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.d scrub ok
Nov 29 06:23:19 compute-2 ceph-mon[77142]: 8.9 deep-scrub starts
Nov 29 06:23:19 compute-2 ceph-mon[77142]: 8.9 deep-scrub ok
Nov 29 06:23:19 compute-2 ceph-mon[77142]: pgmap v279: 305 pgs: 2 peering, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:19 compute-2 ceph-mon[77142]: 8.1a deep-scrub starts
Nov 29 06:23:19 compute-2 ceph-mon[77142]: 8.1a deep-scrub ok
Nov 29 06:23:19 compute-2 ceph-mon[77142]: 11.a scrub starts
Nov 29 06:23:19 compute-2 ceph-mon[77142]: 11.a scrub ok
Nov 29 06:23:19 compute-2 ceph-mon[77142]: 8.a scrub starts
Nov 29 06:23:19 compute-2 ceph-mon[77142]: pgmap v280: 305 pgs: 2 peering, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:19 compute-2 ceph-mon[77142]: 8.a scrub ok
Nov 29 06:23:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 06:23:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 06:23:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:19 compute-2 ceph-mon[77142]: osdmap e95: 3 total, 3 up, 3 in
Nov 29 06:23:19 compute-2 ceph-mon[77142]: 10.7 scrub starts
Nov 29 06:23:19 compute-2 ceph-mon[77142]: 10.7 scrub ok
Nov 29 06:23:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 06:23:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:20.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:20.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:22 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Nov 29 06:23:22 compute-2 ceph-mon[77142]: pgmap v282: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:22 compute-2 ceph-mon[77142]: 8.f scrub starts
Nov 29 06:23:22 compute-2 ceph-mon[77142]: 8.f scrub ok
Nov 29 06:23:22 compute-2 ceph-mon[77142]: Reconfiguring crash.compute-1 (monmap changed)...
Nov 29 06:23:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:22 compute-2 ceph-mon[77142]: Reconfiguring daemon crash.compute-1 on compute-1
Nov 29 06:23:22 compute-2 ceph-mon[77142]: osdmap e96: 3 total, 3 up, 3 in
Nov 29 06:23:22 compute-2 ceph-mon[77142]: 8.d scrub starts
Nov 29 06:23:22 compute-2 ceph-mon[77142]: 8.d scrub ok
Nov 29 06:23:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:22 compute-2 sudo[88797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnnfgfpekqldbzhxbbivupyimrszhiou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397401.8878222-376-270537516295234/AnsiballZ_command.py'
Nov 29 06:23:22 compute-2 sudo[88797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:22.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:22 compute-2 python3.9[88799]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:23:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:22.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:23 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Nov 29 06:23:23 compute-2 sudo[88797]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:23 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Nov 29 06:23:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:23 compute-2 ceph-mon[77142]: 10.9 scrub starts
Nov 29 06:23:23 compute-2 ceph-mon[77142]: 10.9 scrub ok
Nov 29 06:23:23 compute-2 ceph-mon[77142]: pgmap v284: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 2 B/s, 0 objects/s recovering
Nov 29 06:23:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:23 compute-2 ceph-mon[77142]: Reconfiguring osd.0 (monmap changed)...
Nov 29 06:23:23 compute-2 ceph-mon[77142]: 8.1d scrub starts
Nov 29 06:23:23 compute-2 ceph-mon[77142]: 8.1d scrub ok
Nov 29 06:23:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Nov 29 06:23:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:23 compute-2 ceph-mon[77142]: Reconfiguring daemon osd.0 on compute-1
Nov 29 06:23:23 compute-2 ceph-mon[77142]: osdmap e97: 3 total, 3 up, 3 in
Nov 29 06:23:23 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 97 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=6 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] async=[0] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:23:23 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 97 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=5 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] async=[0] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:23:23 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.e scrub starts
Nov 29 06:23:23 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.e scrub ok
Nov 29 06:23:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:24.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:24 compute-2 sudo[89086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbfarczsblbgwavshpykobitjomcbeoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397403.8688483-401-185974569609471/AnsiballZ_selinux.py'
Nov 29 06:23:24 compute-2 sudo[89086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:24 compute-2 python3.9[89088]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 06:23:24 compute-2 sudo[89086]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:23:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:24.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:23:25 compute-2 ceph-mon[77142]: pgmap v286: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 2 B/s, 0 objects/s recovering
Nov 29 06:23:25 compute-2 ceph-mon[77142]: 8.1e scrub starts
Nov 29 06:23:25 compute-2 ceph-mon[77142]: 8.1e scrub ok
Nov 29 06:23:25 compute-2 ceph-mon[77142]: 11.19 scrub starts
Nov 29 06:23:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:25 compute-2 ceph-mon[77142]: pgmap v287: 305 pgs: 1 active+clean+scrubbing, 2 activating+remapped, 302 active+clean; 456 KiB data, 139 MiB used, 21 GiB / 21 GiB avail; 14 KiB/s rd, 296 B/s wr, 25 op/s; 12/214 objects misplaced (5.607%); 18 B/s, 1 objects/s recovering
Nov 29 06:23:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:25 compute-2 ceph-mon[77142]: Reconfiguring mon.compute-1 (monmap changed)...
Nov 29 06:23:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 06:23:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Nov 29 06:23:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:25 compute-2 ceph-mon[77142]: Reconfiguring daemon mon.compute-1 on compute-1
Nov 29 06:23:25 compute-2 sudo[89238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cscpgdywjfmwzavmpgojzjdvfuhvgpgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397405.4812663-433-49732404025774/AnsiballZ_command.py'
Nov 29 06:23:25 compute-2 sudo[89238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:26 compute-2 python3.9[89240]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 06:23:26 compute-2 sudo[89238]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:26.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Nov 29 06:23:26 compute-2 sudo[89391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwirctelrgiozslliodkeldfpdylyuwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397406.3649812-458-94052258936130/AnsiballZ_file.py'
Nov 29 06:23:26 compute-2 sudo[89391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:26 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 98 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=5 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98 pruub=13.176123619s) [0] async=[0] r=-1 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 56'1130 active pruub 231.373046875s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:26 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 98 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=6 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98 pruub=13.171274185s) [0] async=[0] r=-1 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 56'1130 active pruub 231.368865967s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:26 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 98 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=6 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98 pruub=13.171154976s) [0] r=-1 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 231.368865967s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:26 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 98 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=5 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98 pruub=13.175400734s) [0] r=-1 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 231.373046875s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:26 compute-2 python3.9[89393]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:23:26 compute-2 sudo[89391]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:26.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:27 compute-2 sudo[89543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjzjcgzmuhfiubophkbfcnzewfepygaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397407.2436326-482-237438938802660/AnsiballZ_mount.py'
Nov 29 06:23:27 compute-2 sudo[89543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:27 compute-2 python3.9[89545]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 06:23:28 compute-2 sudo[89543]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:28.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:28.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:29 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Nov 29 06:23:29 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Nov 29 06:23:29 compute-2 sudo[89697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdkdtteifpjooacrozfbnajqkyoeuqkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397409.2294104-566-34652311829559/AnsiballZ_file.py'
Nov 29 06:23:29 compute-2 sudo[89697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:29 compute-2 python3.9[89699]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:23:29 compute-2 sudo[89697]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:29 compute-2 ceph-mon[77142]: 11.19 scrub ok
Nov 29 06:23:29 compute-2 ceph-mon[77142]: 11.e scrub starts
Nov 29 06:23:29 compute-2 ceph-mon[77142]: 11.e scrub ok
Nov 29 06:23:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Nov 29 06:23:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:30.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:30 compute-2 sudo[89850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-useziwrayfxgruuodlmlovoquscyurhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397410.0335605-590-171389631671018/AnsiballZ_stat.py'
Nov 29 06:23:30 compute-2 sudo[89850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:30 compute-2 python3.9[89852]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:23:30 compute-2 ceph-mon[77142]: 10.a scrub starts
Nov 29 06:23:30 compute-2 ceph-mon[77142]: 10.a scrub ok
Nov 29 06:23:30 compute-2 ceph-mon[77142]: pgmap v288: 305 pgs: 1 active+clean+scrubbing, 2 activating+remapped, 302 active+clean; 456 KiB data, 139 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 255 B/s wr, 22 op/s; 12/214 objects misplaced (5.607%); 15 B/s, 1 objects/s recovering
Nov 29 06:23:30 compute-2 ceph-mon[77142]: osdmap e98: 3 total, 3 up, 3 in
Nov 29 06:23:30 compute-2 ceph-mon[77142]: pgmap v290: 305 pgs: 1 active+clean+scrubbing, 2 activating+remapped, 302 active+clean; 456 KiB data, 143 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 255 B/s wr, 22 op/s; 12/214 objects misplaced (5.607%); 27 B/s, 1 objects/s recovering
Nov 29 06:23:30 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:30 compute-2 ceph-mon[77142]: 9.1 scrub starts
Nov 29 06:23:30 compute-2 ceph-mon[77142]: 9.1 scrub ok
Nov 29 06:23:30 compute-2 sudo[89850]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:30 compute-2 sudo[89929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paeomshacrkigyjoxzoxomqryuslygbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397410.0335605-590-171389631671018/AnsiballZ_file.py'
Nov 29 06:23:30 compute-2 sudo[89929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:30.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:31 compute-2 python3.9[89931]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:23:31 compute-2 sudo[89929]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:32.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:32 compute-2 sudo[90081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtdhjjjvordxrpqzwrkbkcwjlpcvptpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397412.0159082-653-228528246179210/AnsiballZ_stat.py'
Nov 29 06:23:32 compute-2 sudo[90081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:32 compute-2 python3.9[90083]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:23:32 compute-2 sudo[90081]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:32 compute-2 sudo[90111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:32 compute-2 sudo[90111]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:32 compute-2 sudo[90111]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:32 compute-2 sudo[90136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:23:32 compute-2 sudo[90136]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:32 compute-2 sudo[90136]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:32 compute-2 sudo[90161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:32 compute-2 sudo[90161]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:32 compute-2 sudo[90161]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:32.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:32 compute-2 sudo[90186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:23:32 compute-2 sudo[90186]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:33 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Nov 29 06:23:33 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Nov 29 06:23:33 compute-2 podman[90227]: 2025-11-29 06:23:33.277203305 +0000 UTC m=+0.063323011 container create 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:23:33 compute-2 systemd[1]: Started libpod-conmon-29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8.scope.
Nov 29 06:23:33 compute-2 podman[90227]: 2025-11-29 06:23:33.25173897 +0000 UTC m=+0.037858666 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:23:33 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:23:33 compute-2 podman[90227]: 2025-11-29 06:23:33.396717645 +0000 UTC m=+0.182837361 container init 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Nov 29 06:23:33 compute-2 podman[90227]: 2025-11-29 06:23:33.403537446 +0000 UTC m=+0.189657122 container start 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:23:33 compute-2 podman[90227]: 2025-11-29 06:23:33.408480527 +0000 UTC m=+0.194600253 container attach 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Nov 29 06:23:33 compute-2 nostalgic_feynman[90268]: 167 167
Nov 29 06:23:33 compute-2 systemd[1]: libpod-29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8.scope: Deactivated successfully.
Nov 29 06:23:33 compute-2 podman[90301]: 2025-11-29 06:23:33.46134193 +0000 UTC m=+0.031004544 container died 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:23:33 compute-2 systemd[1]: var-lib-containers-storage-overlay-e27a7cd7a6c72ac3bbe825aa7e204f1c284b773448ac3dc4a6199262f0f0a76a-merged.mount: Deactivated successfully.
Nov 29 06:23:33 compute-2 podman[90301]: 2025-11-29 06:23:33.507222197 +0000 UTC m=+0.076884811 container remove 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Nov 29 06:23:33 compute-2 systemd[1]: libpod-conmon-29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8.scope: Deactivated successfully.
Nov 29 06:23:33 compute-2 sudo[90186]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:33 compute-2 sudo[90391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brminezsrkmjeaxatwqfzfbfylczgtjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397413.3155143-692-247781250583084/AnsiballZ_getent.py'
Nov 29 06:23:33 compute-2 sudo[90391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:33 compute-2 python3.9[90393]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 06:23:34 compute-2 sudo[90391]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:34 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Nov 29 06:23:34 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Nov 29 06:23:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:34.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:34 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:34.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:35 compute-2 sudo[90545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrjlkpdjapbdewcfiepzulwbrgrxzlbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397414.709331-722-72901919283080/AnsiballZ_getent.py'
Nov 29 06:23:35 compute-2 sudo[90545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:35 compute-2 python3.9[90547]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 06:23:35 compute-2 sudo[90545]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:36.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:36.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:37 compute-2 sudo[90699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drazmxlfxthjwioruuilzrkspvbnsfgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397416.0751133-746-70348517539378/AnsiballZ_group.py'
Nov 29 06:23:37 compute-2 sudo[90699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:37 compute-2 python3.9[90701]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:23:37 compute-2 sudo[90699]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:37 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Nov 29 06:23:37 compute-2 ceph-mon[77142]: 8.11 scrub starts
Nov 29 06:23:37 compute-2 ceph-mon[77142]: 8.11 scrub ok
Nov 29 06:23:37 compute-2 ceph-mon[77142]: osdmap e99: 3 total, 3 up, 3 in
Nov 29 06:23:37 compute-2 ceph-mon[77142]: 10.b scrub starts
Nov 29 06:23:37 compute-2 ceph-mon[77142]: pgmap v292: 305 pgs: 2 activating+remapped, 303 active+clean; 456 KiB data, 143 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 255 B/s wr, 22 op/s; 12/214 objects misplaced (5.607%); 27 B/s, 1 objects/s recovering
Nov 29 06:23:37 compute-2 ceph-mon[77142]: 10.b scrub ok
Nov 29 06:23:37 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:37 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 06:23:38 compute-2 sudo[90800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:38 compute-2 sudo[90800]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:38 compute-2 sudo[90800]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:38 compute-2 sudo[90826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:23:38 compute-2 sudo[90826]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:38 compute-2 sudo[90826]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:38 compute-2 sudo[90875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:38 compute-2 sudo[90875]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:38 compute-2 sudo[90875]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:38 compute-2 sudo[90926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzppzoddiillpmeobenhbjqxhpjglxnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397417.90353-772-232849523118790/AnsiballZ_file.py'
Nov 29 06:23:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:38.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:38 compute-2 sudo[90926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:38 compute-2 sudo[90927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:23:38 compute-2 sudo[90927]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:38 compute-2 ceph-mon[77142]: pgmap v293: 305 pgs: 305 active+clean; 456 KiB data, 143 MiB used, 21 GiB / 21 GiB avail; 27 B/s, 1 objects/s recovering
Nov 29 06:23:38 compute-2 ceph-mon[77142]: Reconfiguring mon.compute-2 (monmap changed)...
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 9.2 scrub starts
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 9.2 scrub ok
Nov 29 06:23:38 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 06:23:38 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Nov 29 06:23:38 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:38 compute-2 ceph-mon[77142]: Reconfiguring daemon mon.compute-2 on compute-2
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 8.3 scrub starts
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 10.c scrub starts
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 8.3 scrub ok
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 10.c scrub ok
Nov 29 06:23:38 compute-2 ceph-mon[77142]: pgmap v294: 305 pgs: 305 active+clean; 456 KiB data, 143 MiB used, 21 GiB / 21 GiB avail; 27 B/s, 1 objects/s recovering
Nov 29 06:23:38 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 10.12 scrub starts
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 10.12 scrub ok
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 10.d scrub starts
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 10.d scrub ok
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 9.4 scrub starts
Nov 29 06:23:38 compute-2 ceph-mon[77142]: pgmap v295: 305 pgs: 305 active+clean; 456 KiB data, 143 MiB used, 21 GiB / 21 GiB avail; 24 B/s, 1 objects/s recovering
Nov 29 06:23:38 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 9.c scrub starts
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 9.c scrub ok
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 9.4 scrub ok
Nov 29 06:23:38 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 06:23:38 compute-2 ceph-mon[77142]: osdmap e100: 3 total, 3 up, 3 in
Nov 29 06:23:38 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:38 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:38 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 10.e scrub starts
Nov 29 06:23:38 compute-2 ceph-mon[77142]: 10.e scrub ok
Nov 29 06:23:38 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Nov 29 06:23:38 compute-2 python3.9[90935]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 06:23:38 compute-2 sudo[90926]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:38 compute-2 podman[91049]: 2025-11-29 06:23:38.854343274 +0000 UTC m=+0.056193301 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:23:38 compute-2 podman[91049]: 2025-11-29 06:23:38.962325969 +0000 UTC m=+0.164175966 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Nov 29 06:23:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:38.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:39 compute-2 sudo[91098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:39 compute-2 sudo[91098]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:39 compute-2 sudo[91098]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:39 compute-2 sudo[91141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:39 compute-2 sudo[91141]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:39 compute-2 sudo[91141]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:39 compute-2 sudo[91388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvwpinoidaphxszhncskpkgluknirhhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397419.5470364-805-266115411640755/AnsiballZ_dnf.py'
Nov 29 06:23:39 compute-2 sudo[91388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:40 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.1e scrub starts
Nov 29 06:23:40 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.1e scrub ok
Nov 29 06:23:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:40.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:40.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:41 compute-2 python3.9[91390]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:23:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:41 compute-2 podman[91279]: 2025-11-29 06:23:41.507974835 +0000 UTC m=+1.905463955 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:23:41 compute-2 podman[91279]: 2025-11-29 06:23:41.515000311 +0000 UTC m=+1.912489401 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:23:41 compute-2 podman[91443]: 2025-11-29 06:23:41.717063541 +0000 UTC m=+0.058830561 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Nov 29 06:23:41 compute-2 podman[91463]: 2025-11-29 06:23:41.819987861 +0000 UTC m=+0.083049784 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, build-date=2023-02-22T09:23:20)
Nov 29 06:23:42 compute-2 podman[91443]: 2025-11-29 06:23:42.049168601 +0000 UTC m=+0.390935611 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, release=1793, io.buildah.version=1.28.2, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, vendor=Red Hat, Inc., version=2.2.4, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 29 06:23:42 compute-2 ceph-mon[77142]: pgmap v297: 305 pgs: 2 active+clean+scrubbing, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 27 B/s, 1 objects/s recovering
Nov 29 06:23:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 06:23:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 06:23:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Nov 29 06:23:42 compute-2 ceph-mon[77142]: osdmap e101: 3 total, 3 up, 3 in
Nov 29 06:23:42 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Nov 29 06:23:42 compute-2 sudo[90927]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:42 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Nov 29 06:23:42 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Nov 29 06:23:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:42.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:42 compute-2 sudo[91476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:42 compute-2 sudo[91476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:42 compute-2 sudo[91476]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:42 compute-2 sudo[91502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:23:42 compute-2 sudo[91502]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:42 compute-2 sudo[91502]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:42 compute-2 sudo[91527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:42 compute-2 sudo[91527]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:42 compute-2 sudo[91527]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:42 compute-2 sudo[91552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:23:42 compute-2 sudo[91552]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:42 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Nov 29 06:23:42 compute-2 sudo[91552]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:42 compute-2 sudo[91388]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:42.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:43 compute-2 ceph-mon[77142]: pgmap v299: 305 pgs: 2 active+clean+scrubbing, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Nov 29 06:23:43 compute-2 ceph-mon[77142]: 10.1e scrub starts
Nov 29 06:23:43 compute-2 ceph-mon[77142]: 10.1e scrub ok
Nov 29 06:23:43 compute-2 ceph-mon[77142]: 9.12 scrub starts
Nov 29 06:23:43 compute-2 ceph-mon[77142]: 9.12 scrub ok
Nov 29 06:23:43 compute-2 ceph-mon[77142]: pgmap v300: 305 pgs: 1 unknown, 1 remapped+peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:43 compute-2 ceph-mon[77142]: 11.13 scrub starts
Nov 29 06:23:43 compute-2 ceph-mon[77142]: 11.13 scrub ok
Nov 29 06:23:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-2 ceph-mon[77142]: osdmap e102: 3 total, 3 up, 3 in
Nov 29 06:23:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-2 ceph-mon[77142]: 10.16 deep-scrub starts
Nov 29 06:23:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-2 ceph-mon[77142]: 10.16 deep-scrub ok
Nov 29 06:23:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Nov 29 06:23:43 compute-2 ceph-mon[77142]: osdmap e103: 3 total, 3 up, 3 in
Nov 29 06:23:43 compute-2 sudo[91757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goytjamcpdooncqzrivkfjvdptgwhtag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397423.1552541-830-178447588566930/AnsiballZ_file.py'
Nov 29 06:23:43 compute-2 sudo[91757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:43 compute-2 python3.9[91759]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:23:43 compute-2 sudo[91757]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:44.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:44 compute-2 sudo[91910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfjjzvtwbplorzowizlvpkzchqwybfiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397424.0826864-854-217973675378646/AnsiballZ_stat.py'
Nov 29 06:23:44 compute-2 sudo[91910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:44 compute-2 python3.9[91912]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:23:44 compute-2 sudo[91910]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:44 compute-2 sudo[91988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuvgkqudmevqnrepqoftfvmsnleomttc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397424.0826864-854-217973675378646/AnsiballZ_file.py'
Nov 29 06:23:44 compute-2 sudo[91988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:44.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:45 compute-2 python3.9[91990]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:23:45 compute-2 sudo[91988]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:46 compute-2 sudo[92140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynarwbwocdoclfzpghqwhzfmortrfshs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397425.913886-893-116747602631689/AnsiballZ_stat.py'
Nov 29 06:23:46 compute-2 sudo[92140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:46.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:46 compute-2 python3.9[92142]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:23:46 compute-2 sudo[92140]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:46 compute-2 sudo[92219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwmbgchxldiducpomudljegjorxchumf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397425.913886-893-116747602631689/AnsiballZ_file.py'
Nov 29 06:23:46 compute-2 sudo[92219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:46 compute-2 python3.9[92221]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:23:46 compute-2 sudo[92219]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:46.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:47 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:47 compute-2 sudo[92371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joergshbmonjogpeoasljnguzlgqqpng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397427.6049683-938-41258422353211/AnsiballZ_dnf.py'
Nov 29 06:23:47 compute-2 sudo[92371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:47 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Nov 29 06:23:48 compute-2 python3.9[92373]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:23:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:48.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:48 compute-2 ceph-mon[77142]: 9.14 scrub starts
Nov 29 06:23:48 compute-2 ceph-mon[77142]: 9.14 scrub ok
Nov 29 06:23:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:23:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:23:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:23:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:48 compute-2 ceph-mon[77142]: 10.17 deep-scrub starts
Nov 29 06:23:48 compute-2 ceph-mon[77142]: 10.17 deep-scrub ok
Nov 29 06:23:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:49.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:49 compute-2 sudo[92371]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Nov 29 06:23:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:50.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:51.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:51 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Nov 29 06:23:51 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Nov 29 06:23:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:52.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:52 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:52 compute-2 python3.9[92527]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:23:52 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Nov 29 06:23:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:53.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:53 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Nov 29 06:23:53 compute-2 python3.9[92679]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 06:23:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:54.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:55.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:55 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.c scrub starts
Nov 29 06:23:55 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.c scrub ok
Nov 29 06:23:55 compute-2 ceph-mon[77142]: pgmap v303: 305 pgs: 1 unknown, 1 remapped+peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:55 compute-2 ceph-mon[77142]: 10.1a scrub starts
Nov 29 06:23:55 compute-2 ceph-mon[77142]: 10.1a scrub ok
Nov 29 06:23:55 compute-2 ceph-mon[77142]: 9.1c scrub starts
Nov 29 06:23:55 compute-2 ceph-mon[77142]: 9.1c scrub ok
Nov 29 06:23:55 compute-2 ceph-mon[77142]: pgmap v304: 305 pgs: 1 unknown, 1 remapped+peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:55 compute-2 ceph-mon[77142]: osdmap e104: 3 total, 3 up, 3 in
Nov 29 06:23:55 compute-2 ceph-mon[77142]: 11.2 scrub starts
Nov 29 06:23:55 compute-2 ceph-mon[77142]: 11.2 scrub ok
Nov 29 06:23:55 compute-2 ceph-mon[77142]: pgmap v306: 305 pgs: 1 unknown, 1 active+remapped, 1 peering, 302 active+clean; 455 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 18 B/s, 1 objects/s recovering
Nov 29 06:23:55 compute-2 ceph-mon[77142]: 10.1c scrub starts
Nov 29 06:23:55 compute-2 ceph-mon[77142]: 10.1c scrub ok
Nov 29 06:23:55 compute-2 python3.9[92830]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:23:56 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Nov 29 06:23:56 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Nov 29 06:23:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:23:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:56.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:23:56 compute-2 sudo[92981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bykktlomgxvnqtnqlabepoieupyfoixb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397436.1591911-1061-91661741818672/AnsiballZ_systemd.py'
Nov 29 06:23:56 compute-2 sudo[92981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:57.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:57 compute-2 python3.9[92983]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:23:57 compute-2 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 10.1d scrub starts
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 10.1d scrub ok
Nov 29 06:23:57 compute-2 ceph-mon[77142]: pgmap v307: 305 pgs: 1 unknown, 1 active+remapped, 1 peering, 302 active+clean; 455 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 13 B/s, 0 objects/s recovering
Nov 29 06:23:57 compute-2 ceph-mon[77142]: osdmap e105: 3 total, 3 up, 3 in
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 10.1f scrub starts
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 10.1f scrub ok
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 8.1c scrub starts
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 8.1c scrub ok
Nov 29 06:23:57 compute-2 ceph-mon[77142]: pgmap v309: 305 pgs: 1 activating+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 4/219 objects misplaced (1.826%); 13 B/s, 0 objects/s recovering
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 11.6 scrub starts
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 8.1f scrub starts
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 8.1f scrub ok
Nov 29 06:23:57 compute-2 ceph-mon[77142]: pgmap v310: 305 pgs: 1 activating+remapped, 304 active+clean; 455 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 4/219 objects misplaced (1.826%); 13 B/s, 0 objects/s recovering
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 11.9 scrub starts
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 8.c scrub starts
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 8.c scrub ok
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 11.6 scrub ok
Nov 29 06:23:57 compute-2 ceph-mon[77142]: 11.9 scrub ok
Nov 29 06:23:57 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:57 compute-2 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 06:23:57 compute-2 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 06:23:57 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 06:23:57 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 06:23:57 compute-2 sudo[92981]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:58.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:23:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:23:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:59.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:23:59 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.17 deep-scrub starts
Nov 29 06:23:59 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.17 deep-scrub ok
Nov 29 06:23:59 compute-2 sudo[93020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:59 compute-2 sudo[93020]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:59 compute-2 sudo[93020]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:59 compute-2 sudo[93045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:59 compute-2 sudo[93045]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:59 compute-2 sudo[93045]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Nov 29 06:24:00 compute-2 python3.9[93195]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 06:24:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:00.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:01.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:01 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1b scrub starts
Nov 29 06:24:01 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1b scrub ok
Nov 29 06:24:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:02.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:03.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:03 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.7 deep-scrub starts
Nov 29 06:24:03 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.7 deep-scrub ok
Nov 29 06:24:03 compute-2 ceph-mon[77142]: 11.17 scrub starts
Nov 29 06:24:03 compute-2 ceph-mon[77142]: 11.17 scrub ok
Nov 29 06:24:03 compute-2 ceph-mon[77142]: pgmap v311: 305 pgs: 1 activating+remapped, 304 active+clean; 455 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 4/219 objects misplaced (1.826%)
Nov 29 06:24:03 compute-2 ceph-mon[77142]: 8.10 scrub starts
Nov 29 06:24:03 compute-2 ceph-mon[77142]: 8.10 scrub ok
Nov 29 06:24:03 compute-2 ceph-mon[77142]: 11.b scrub starts
Nov 29 06:24:03 compute-2 ceph-mon[77142]: 11.b scrub ok
Nov 29 06:24:03 compute-2 ceph-mon[77142]: 11.14 scrub starts
Nov 29 06:24:03 compute-2 ceph-mon[77142]: 11.14 scrub ok
Nov 29 06:24:03 compute-2 ceph-mon[77142]: 11.c scrub starts
Nov 29 06:24:04 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.b scrub starts
Nov 29 06:24:04 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.b scrub ok
Nov 29 06:24:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:04.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:04 compute-2 sudo[93348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxylozqxbnkthtvqpkusjbxaghcqesly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397444.2249-1232-274388375815803/AnsiballZ_systemd.py'
Nov 29 06:24:04 compute-2 sudo[93348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:04 compute-2 python3.9[93350]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:24:04 compute-2 sudo[93348]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:05.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Nov 29 06:24:05 compute-2 sudo[93502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqrrgfvtkkpaspxuxselbcrhbzjtkypi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397445.0642347-1232-101984689532581/AnsiballZ_systemd.py'
Nov 29 06:24:05 compute-2 sudo[93502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:05 compute-2 python3.9[93504]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:24:05 compute-2 sudo[93502]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:05 compute-2 ceph-mon[77142]: pgmap v312: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 11.c scrub ok
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 9.17 deep-scrub starts
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 9.17 deep-scrub ok
Nov 29 06:24:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 8.17 scrub starts
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 8.17 scrub ok
Nov 29 06:24:05 compute-2 ceph-mon[77142]: pgmap v313: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:24:05 compute-2 ceph-mon[77142]: osdmap e106: 3 total, 3 up, 3 in
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 11.d scrub starts
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 9.1b scrub starts
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 9.1b scrub ok
Nov 29 06:24:05 compute-2 ceph-mon[77142]: pgmap v315: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 11.d scrub ok
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 11.10 scrub starts
Nov 29 06:24:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 06:24:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 9.7 deep-scrub starts
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 9.7 deep-scrub ok
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 11.10 scrub ok
Nov 29 06:24:05 compute-2 ceph-mon[77142]: pgmap v316: 305 pgs: 1 active+clean+scrubbing, 1 active+remapped, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 9.b scrub starts
Nov 29 06:24:05 compute-2 ceph-mon[77142]: 9.b scrub ok
Nov 29 06:24:05 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 06:24:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:06.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:07.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:07 compute-2 sshd-session[85502]: Connection closed by 192.168.122.30 port 35394
Nov 29 06:24:07 compute-2 sshd-session[85499]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:24:07 compute-2 systemd-logind[784]: Session 34 logged out. Waiting for processes to exit.
Nov 29 06:24:07 compute-2 systemd[1]: session-34.scope: Deactivated successfully.
Nov 29 06:24:07 compute-2 systemd[1]: session-34.scope: Consumed 1min 7.390s CPU time.
Nov 29 06:24:07 compute-2 systemd-logind[784]: Removed session 34.
Nov 29 06:24:07 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:08 compute-2 sudo[93532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:24:08 compute-2 sudo[93532]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:24:08 compute-2 sudo[93532]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:08 compute-2 sudo[93557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:24:08 compute-2 sudo[93557]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:24:08 compute-2 sudo[93557]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:08.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:09.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:10.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Nov 29 06:24:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 06:24:10 compute-2 ceph-mon[77142]: 11.1e scrub starts
Nov 29 06:24:10 compute-2 ceph-mon[77142]: 11.1e scrub ok
Nov 29 06:24:10 compute-2 ceph-mon[77142]: 11.11 scrub starts
Nov 29 06:24:10 compute-2 ceph-mon[77142]: 11.11 scrub ok
Nov 29 06:24:10 compute-2 ceph-mon[77142]: osdmap e107: 3 total, 3 up, 3 in
Nov 29 06:24:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:24:10 compute-2 ceph-mon[77142]: pgmap v318: 305 pgs: 1 active+clean+scrubbing, 1 active+remapped, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 06:24:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:11.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:11 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Nov 29 06:24:11 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Nov 29 06:24:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:12.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:12 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:13.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:13 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Nov 29 06:24:13 compute-2 ceph-mon[77142]: 11.15 deep-scrub starts
Nov 29 06:24:13 compute-2 ceph-mon[77142]: 11.15 deep-scrub ok
Nov 29 06:24:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:24:13 compute-2 ceph-mon[77142]: pgmap v319: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 06:24:13 compute-2 ceph-mon[77142]: pgmap v320: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 06:24:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 06:24:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 06:24:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 06:24:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 06:24:13 compute-2 ceph-mon[77142]: 11.18 scrub starts
Nov 29 06:24:13 compute-2 ceph-mon[77142]: osdmap e108: 3 total, 3 up, 3 in
Nov 29 06:24:13 compute-2 ceph-mon[77142]: 11.7 deep-scrub starts
Nov 29 06:24:14 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Nov 29 06:24:14 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Nov 29 06:24:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:14.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:24:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:15.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:24:16 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Nov 29 06:24:16 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Nov 29 06:24:16 compute-2 sshd-session[93586]: Accepted publickey for zuul from 192.168.122.30 port 42760 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:24:16 compute-2 systemd-logind[784]: New session 35 of user zuul.
Nov 29 06:24:16 compute-2 systemd[1]: Started Session 35 of User zuul.
Nov 29 06:24:16 compute-2 sshd-session[93586]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:24:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:16.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:17.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:17 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Nov 29 06:24:17 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Nov 29 06:24:17 compute-2 python3.9[93740]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:24:17 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:18 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Nov 29 06:24:18 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Nov 29 06:24:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:18.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:18 compute-2 sudo[93895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eshukpmbokevxzhvvqmilbjxtkjxubqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397458.3802288-76-266395195235565/AnsiballZ_getent.py'
Nov 29 06:24:18 compute-2 sudo[93895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:19.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:19 compute-2 python3.9[93897]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 06:24:19 compute-2 sudo[93895]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:19 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.19 scrub starts
Nov 29 06:24:19 compute-2 sudo[93923]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:24:19 compute-2 sudo[93923]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:24:19 compute-2 sudo[93923]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:19 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.19 scrub ok
Nov 29 06:24:19 compute-2 sudo[93948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:24:19 compute-2 sudo[93948]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:24:19 compute-2 sudo[93948]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:19 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Nov 29 06:24:19 compute-2 ceph-mon[77142]: 11.18 scrub ok
Nov 29 06:24:19 compute-2 ceph-mon[77142]: 9.13 scrub starts
Nov 29 06:24:19 compute-2 ceph-mon[77142]: 9.13 scrub ok
Nov 29 06:24:19 compute-2 ceph-mon[77142]: 11.7 deep-scrub ok
Nov 29 06:24:19 compute-2 ceph-mon[77142]: pgmap v322: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 06:24:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 06:24:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 110 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=4 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=110 pruub=14.469883919s) [0] r=-1 lpr=110 pi=[77,110)/1 crt=56'1130 mlcod 0'0 active pruub 285.761932373s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 110 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=4 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=110 pruub=14.469782829s) [0] r=-1 lpr=110 pi=[77,110)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 285.761932373s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:20 compute-2 sudo[94098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zebmzfqaddbczlyclldbolnphlexvlrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397459.6922076-111-183116788958914/AnsiballZ_setup.py'
Nov 29 06:24:20 compute-2 sudo[94098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:20 compute-2 python3.9[94100]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:24:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:20.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:20 compute-2 sudo[94098]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:20 compute-2 sudo[94185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udnbesgrhftjqlnaschugjbtkotpxujl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397459.6922076-111-183116788958914/AnsiballZ_dnf.py'
Nov 29 06:24:20 compute-2 sudo[94185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Nov 29 06:24:21 compute-2 sshd-session[94110]: Invalid user ftpuser from 92.118.39.92 port 58428
Nov 29 06:24:21 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 111 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=4 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=111) [0]/[2] r=0 lpr=111 pi=[77,111)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:21 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 111 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=4 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=111) [0]/[2] r=0 lpr=111 pi=[77,111)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:24:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:21.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:21 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 111 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=111) [2] r=0 lpr=111 pi=[78,111)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:24:21 compute-2 sshd-session[94110]: Connection closed by invalid user ftpuser 92.118.39.92 port 58428 [preauth]
Nov 29 06:24:21 compute-2 python3.9[94187]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:24:22 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Nov 29 06:24:22 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Nov 29 06:24:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:22.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:23.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:23 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:24 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 06:24:24 compute-2 ceph-mon[77142]: osdmap e109: 3 total, 3 up, 3 in
Nov 29 06:24:24 compute-2 ceph-mon[77142]: pgmap v324: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 9.3 scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 9.3 scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 8.1b scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 8.1b scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 11.1f deep-scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: pgmap v325: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 9.15 scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 9.15 scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 10.8 deep-scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 9.5 scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 9.5 scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 11.1f deep-scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 10.8 deep-scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: pgmap v326: 305 pgs: 2 active+clean+scrubbing+deep, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 9.9 scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 9.9 scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 10.14 deep-scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 9.19 scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 8.4 deep-scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 8.4 deep-scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 9.19 scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 10.14 deep-scrub ok
Nov 29 06:24:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 06:24:24 compute-2 ceph-mon[77142]: osdmap e110: 3 total, 3 up, 3 in
Nov 29 06:24:24 compute-2 ceph-mon[77142]: pgmap v328: 305 pgs: 2 active+clean+scrubbing+deep, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 8.12 scrub starts
Nov 29 06:24:24 compute-2 ceph-mon[77142]: 8.12 scrub ok
Nov 29 06:24:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:24.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Nov 29 06:24:25 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 112 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=112) [2]/[0] r=-1 lpr=112 pi=[78,112)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:25 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 112 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=112) [2]/[0] r=-1 lpr=112 pi=[78,112)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:25.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:25 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 112 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=111/112 n=4 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=111) [0]/[2] async=[0] r=0 lpr=111 pi=[77,111)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:24:25 compute-2 sudo[94185]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:26 compute-2 sudo[94340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygvpxtdhlwrsgvjdzfvhttjvitdkcliw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397465.7450736-154-159739659939536/AnsiballZ_dnf.py'
Nov 29 06:24:26 compute-2 sudo[94340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:26 compute-2 python3.9[94342]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:26.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:27.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:27 compute-2 sudo[94340]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:28.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:29.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 06:24:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 06:24:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 06:24:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 10.13 deep-scrub starts
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 10.13 deep-scrub ok
Nov 29 06:24:29 compute-2 ceph-mon[77142]: osdmap e111: 3 total, 3 up, 3 in
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 11.1d scrub starts
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 11.1d scrub ok
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 9.8 scrub starts
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 10.5 scrub starts
Nov 29 06:24:29 compute-2 ceph-mon[77142]: pgmap v330: 305 pgs: 2 active+clean+scrubbing+deep, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 9.8 scrub ok
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 10.5 scrub ok
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 8.8 scrub starts
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 8.8 scrub ok
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 9.18 scrub starts
Nov 29 06:24:29 compute-2 ceph-mon[77142]: pgmap v331: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 9.18 scrub ok
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 10.1b scrub starts
Nov 29 06:24:29 compute-2 ceph-mon[77142]: 10.1b scrub ok
Nov 29 06:24:29 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Nov 29 06:24:29 compute-2 ceph-mon[77142]: osdmap e112: 3 total, 3 up, 3 in
Nov 29 06:24:29 compute-2 sudo[94495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxmphochbymidejvicvdgsjhqtwkghzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397469.2592957-178-249036368635953/AnsiballZ_systemd.py'
Nov 29 06:24:29 compute-2 sudo[94495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:30 compute-2 python3.9[94497]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:24:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:30.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Nov 29 06:24:30 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 113 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=111/112 n=4 ec=58/47 lis/c=111/77 les/c/f=112/79/0 sis=113 pruub=10.494091034s) [0] async=[0] r=-1 lpr=113 pi=[77,113)/1 crt=56'1130 mlcod 56'1130 active pruub 292.558746338s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:30 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 113 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=111/112 n=4 ec=58/47 lis/c=111/77 les/c/f=112/79/0 sis=113 pruub=10.493929863s) [0] r=-1 lpr=113 pi=[77,113)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 292.558746338s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:31.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 11.f scrub starts
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 11.f scrub ok
Nov 29 06:24:31 compute-2 ceph-mon[77142]: pgmap v333: 305 pgs: 1 active+clean+scrubbing, 1 unknown, 1 remapped+peering, 302 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 10.18 scrub starts
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 10.18 scrub ok
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 10.2 scrub starts
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 10.2 scrub ok
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 8.14 scrub starts
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 8.14 scrub ok
Nov 29 06:24:31 compute-2 ceph-mon[77142]: pgmap v334: 305 pgs: 1 active+clean+scrubbing, 1 unknown, 1 remapped+peering, 302 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 10.19 scrub starts
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 10.19 scrub ok
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 11.4 scrub starts
Nov 29 06:24:31 compute-2 ceph-mon[77142]: 11.4 scrub ok
Nov 29 06:24:31 compute-2 sudo[94495]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:32.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:32 compute-2 python3.9[94652]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:24:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:24:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:33.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:24:34 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:34.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:34 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Nov 29 06:24:34 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 114 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=112/78 les/c/f=113/79/0 sis=114) [2] r=0 lpr=114 pi=[78,114)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:34 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 114 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=112/78 les/c/f=113/79/0 sis=114) [2] r=0 lpr=114 pi=[78,114)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:24:34 compute-2 ceph-mon[77142]: pgmap v335: 305 pgs: 1 active+remapped, 1 unknown, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:34 compute-2 ceph-mon[77142]: 11.1a scrub starts
Nov 29 06:24:34 compute-2 ceph-mon[77142]: 11.1a scrub ok
Nov 29 06:24:34 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Nov 29 06:24:34 compute-2 ceph-mon[77142]: osdmap e113: 3 total, 3 up, 3 in
Nov 29 06:24:34 compute-2 ceph-mon[77142]: 8.19 scrub starts
Nov 29 06:24:34 compute-2 sudo[94803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iethiremcaacxlxvvwrgzxhxrnttcyjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397474.2616875-232-220003916509448/AnsiballZ_sefcontext.py'
Nov 29 06:24:34 compute-2 sudo[94803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:24:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:35.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:24:35 compute-2 python3.9[94805]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 06:24:35 compute-2 sudo[94803]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Nov 29 06:24:36 compute-2 ceph-mon[77142]: 8.19 scrub ok
Nov 29 06:24:36 compute-2 ceph-mon[77142]: pgmap v337: 305 pgs: 1 active+remapped, 1 peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:36 compute-2 ceph-mon[77142]: 10.15 scrub starts
Nov 29 06:24:36 compute-2 ceph-mon[77142]: 10.15 scrub ok
Nov 29 06:24:36 compute-2 ceph-mon[77142]: 11.1c scrub starts
Nov 29 06:24:36 compute-2 ceph-mon[77142]: 11.5 scrub starts
Nov 29 06:24:36 compute-2 ceph-mon[77142]: pgmap v338: 305 pgs: 1 active+remapped, 1 peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 0 B/s, 0 objects/s recovering
Nov 29 06:24:36 compute-2 ceph-mon[77142]: osdmap e114: 3 total, 3 up, 3 in
Nov 29 06:24:36 compute-2 ceph-mon[77142]: 11.5 scrub ok
Nov 29 06:24:36 compute-2 ceph-mon[77142]: 11.1c scrub ok
Nov 29 06:24:36 compute-2 ceph-mon[77142]: 11.1 deep-scrub starts
Nov 29 06:24:36 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 115 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=114/115 n=5 ec=58/47 lis/c=112/78 les/c/f=113/79/0 sis=114) [2] r=0 lpr=114 pi=[78,114)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:24:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:36.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:36 compute-2 python3.9[94955]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:24:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:37.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:37 compute-2 sudo[95112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idhbplygahznzifyqfttiqutgllnztdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397477.0616767-286-222453965819244/AnsiballZ_dnf.py'
Nov 29 06:24:37 compute-2 sudo[95112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:37 compute-2 python3.9[95114]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:37 compute-2 ceph-mon[77142]: 11.1 deep-scrub ok
Nov 29 06:24:37 compute-2 ceph-mon[77142]: osdmap e115: 3 total, 3 up, 3 in
Nov 29 06:24:37 compute-2 ceph-mon[77142]: pgmap v341: 305 pgs: 1 active+remapped, 1 peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 8.2 KiB/s rd, 170 B/s wr, 14 op/s; 36 B/s, 1 objects/s recovering
Nov 29 06:24:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:38.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:39.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:39 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:39 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Nov 29 06:24:39 compute-2 sudo[95112]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:39 compute-2 sudo[95141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:24:39 compute-2 sudo[95141]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:24:39 compute-2 sudo[95141]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:39 compute-2 ceph-mon[77142]: pgmap v342: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 6.5 KiB/s rd, 0 B/s wr, 11 op/s; 29 B/s, 0 objects/s recovering
Nov 29 06:24:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Nov 29 06:24:39 compute-2 sudo[95166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:24:39 compute-2 sudo[95166]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:24:39 compute-2 sudo[95166]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:40 compute-2 sudo[95316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hffnkdeexdjolyjiuijwxydfldsqlzqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397479.7333665-309-123944574726798/AnsiballZ_command.py'
Nov 29 06:24:40 compute-2 sudo[95316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:40 compute-2 python3.9[95318]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:24:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:24:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:40.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:24:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Nov 29 06:24:40 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 117 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=7 ec=58/47 lis/c=84/84 les/c/f=85/85/0 sis=117 pruub=12.623802185s) [1] r=-1 lpr=117 pi=[84,117)/1 crt=56'1130 mlcod 0'0 active pruub 304.739318848s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:40 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 117 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=7 ec=58/47 lis/c=84/84 les/c/f=85/85/0 sis=117 pruub=12.623720169s) [1] r=-1 lpr=117 pi=[84,117)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 304.739318848s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:40 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Nov 29 06:24:40 compute-2 ceph-mon[77142]: osdmap e116: 3 total, 3 up, 3 in
Nov 29 06:24:40 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Nov 29 06:24:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:41.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:41 compute-2 sudo[95316]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:41 compute-2 ceph-mon[77142]: pgmap v344: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:41 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Nov 29 06:24:41 compute-2 ceph-mon[77142]: osdmap e117: 3 total, 3 up, 3 in
Nov 29 06:24:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Nov 29 06:24:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 118 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=7 ec=58/47 lis/c=84/84 les/c/f=85/85/0 sis=118) [1]/[2] r=0 lpr=118 pi=[84,118)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:41 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 118 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=7 ec=58/47 lis/c=84/84 les/c/f=85/85/0 sis=118) [1]/[2] r=0 lpr=118 pi=[84,118)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:24:42 compute-2 sudo[95604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mycqhdyuqjvpxavukykpdytagtajmjww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397481.5493338-333-122306129324802/AnsiballZ_file.py'
Nov 29 06:24:42 compute-2 sudo[95604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:42 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Nov 29 06:24:42 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Nov 29 06:24:42 compute-2 python3.9[95606]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:24:42 compute-2 sudo[95604]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:24:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:42.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:24:42 compute-2 ceph-mon[77142]: osdmap e118: 3 total, 3 up, 3 in
Nov 29 06:24:42 compute-2 ceph-mon[77142]: pgmap v347: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:42 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Nov 29 06:24:42 compute-2 ceph-mon[77142]: 9.16 scrub starts
Nov 29 06:24:42 compute-2 ceph-mon[77142]: 9.16 scrub ok
Nov 29 06:24:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:43.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:43 compute-2 python3.9[95757]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:24:43 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Nov 29 06:24:43 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 119 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=118/119 n=7 ec=58/47 lis/c=84/84 les/c/f=85/85/0 sis=118) [1]/[2] async=[1] r=0 lpr=118 pi=[84,118)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:24:44 compute-2 ceph-mon[77142]: 11.12 scrub starts
Nov 29 06:24:44 compute-2 ceph-mon[77142]: 11.12 scrub ok
Nov 29 06:24:44 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Nov 29 06:24:44 compute-2 ceph-mon[77142]: osdmap e119: 3 total, 3 up, 3 in
Nov 29 06:24:44 compute-2 sudo[95909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmgntgccvqyfowmokosbvdbhhgpezbfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397483.9102566-381-58309867147789/AnsiballZ_dnf.py'
Nov 29 06:24:44 compute-2 sudo[95909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:44 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:44.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:44 compute-2 python3.9[95911]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:44 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Nov 29 06:24:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:45.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:45 compute-2 sudo[95909]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:46 compute-2 ceph-mon[77142]: pgmap v349: 305 pgs: 1 active+recovering+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 6/212 objects misplaced (2.830%)
Nov 29 06:24:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Nov 29 06:24:46 compute-2 ceph-mon[77142]: osdmap e120: 3 total, 3 up, 3 in
Nov 29 06:24:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Nov 29 06:24:46 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 121 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=2 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=121 pruub=9.605250359s) [1] r=-1 lpr=121 pi=[71,121)/1 crt=56'1130 mlcod 0'0 active pruub 307.441101074s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:46 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 121 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=2 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=121 pruub=9.605175018s) [1] r=-1 lpr=121 pi=[71,121)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 307.441101074s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:46 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 121 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=118/119 n=7 ec=58/47 lis/c=118/84 les/c/f=119/85/0 sis=121 pruub=13.593140602s) [1] async=[1] r=-1 lpr=121 pi=[84,121)/1 crt=56'1130 mlcod 56'1130 active pruub 311.429443359s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:46 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 121 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=118/119 n=7 ec=58/47 lis/c=118/84 les/c/f=119/85/0 sis=121 pruub=13.593073845s) [1] r=-1 lpr=121 pi=[84,121)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 311.429443359s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:24:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:46.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:24:46 compute-2 sudo[96065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdnpjoqgwcvrkxugqwohuiohlpojwfqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397486.4523273-408-162727593982726/AnsiballZ_dnf.py'
Nov 29 06:24:46 compute-2 sudo[96065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:47.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:47 compute-2 python3.9[96067]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:48 compute-2 ceph-mon[77142]: pgmap v351: 305 pgs: 1 active+recovering+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 6/212 objects misplaced (2.830%)
Nov 29 06:24:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Nov 29 06:24:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Nov 29 06:24:48 compute-2 ceph-mon[77142]: osdmap e121: 3 total, 3 up, 3 in
Nov 29 06:24:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:48.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:48 compute-2 sudo[96065]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:49.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:24:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:50.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:24:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:51.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Nov 29 06:24:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 122 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=2 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=122) [1]/[2] r=0 lpr=122 pi=[71,122)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 122 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=2 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=122) [1]/[2] r=0 lpr=122 pi=[71,122)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:24:51 compute-2 ceph-mon[77142]: 11.1b scrub starts
Nov 29 06:24:51 compute-2 ceph-mon[77142]: 11.1b scrub ok
Nov 29 06:24:51 compute-2 ceph-mon[77142]: pgmap v353: 305 pgs: 1 unknown, 1 active+remapped, 1 peering, 302 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 0 B/s, 0 objects/s recovering
Nov 29 06:24:51 compute-2 ceph-mon[77142]: 8.18 scrub starts
Nov 29 06:24:51 compute-2 ceph-mon[77142]: 8.18 scrub ok
Nov 29 06:24:51 compute-2 ceph-mon[77142]: pgmap v354: 305 pgs: 1 unknown, 1 active+remapped, 1 peering, 302 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 0 B/s, 0 objects/s recovering
Nov 29 06:24:51 compute-2 ceph-mon[77142]: 9.e scrub starts
Nov 29 06:24:51 compute-2 ceph-mon[77142]: 9.e scrub ok
Nov 29 06:24:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:52.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:52 compute-2 sudo[96221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzpmnbwtoflkjuipsukafcfwmlhsigui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397492.182045-445-255757083400650/AnsiballZ_stat.py'
Nov 29 06:24:52 compute-2 sudo[96221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:52 compute-2 python3.9[96223]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:24:52 compute-2 sudo[96221]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:53.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:53 compute-2 sudo[96375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkghatleqlbnvzxizhoesluiojckgsvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397492.9490654-469-66501941205355/AnsiballZ_slurp.py'
Nov 29 06:24:53 compute-2 sudo[96375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:53 compute-2 python3.9[96377]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Nov 29 06:24:53 compute-2 sudo[96375]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:54.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:54 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Nov 29 06:24:54 compute-2 ceph-mon[77142]: 9.1e scrub starts
Nov 29 06:24:54 compute-2 ceph-mon[77142]: 9.1e scrub ok
Nov 29 06:24:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Nov 29 06:24:54 compute-2 ceph-mon[77142]: osdmap e122: 3 total, 3 up, 3 in
Nov 29 06:24:54 compute-2 ceph-mon[77142]: pgmap v356: 305 pgs: 1 unknown, 1 active+remapped, 1 peering, 302 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 14 B/s, 0 objects/s recovering
Nov 29 06:24:54 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:55.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:55 compute-2 sshd-session[93590]: Connection closed by 192.168.122.30 port 42760
Nov 29 06:24:55 compute-2 sshd-session[93586]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:24:55 compute-2 systemd[1]: session-35.scope: Deactivated successfully.
Nov 29 06:24:55 compute-2 systemd[1]: session-35.scope: Consumed 19.758s CPU time.
Nov 29 06:24:55 compute-2 systemd-logind[784]: Session 35 logged out. Waiting for processes to exit.
Nov 29 06:24:55 compute-2 systemd-logind[784]: Removed session 35.
Nov 29 06:24:55 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 123 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=122/123 n=2 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=122) [1]/[2] async=[1] r=0 lpr=122 pi=[71,122)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:24:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:56.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:57.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:57 compute-2 ceph-mon[77142]: pgmap v357: 305 pgs: 1 peering, 1 unknown, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 13 B/s, 0 objects/s recovering
Nov 29 06:24:57 compute-2 ceph-mon[77142]: osdmap e123: 3 total, 3 up, 3 in
Nov 29 06:24:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:58.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:24:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:24:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:59.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:24:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Nov 29 06:24:59 compute-2 ceph-mon[77142]: pgmap v359: 305 pgs: 1 peering, 1 unknown, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:59 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 124 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=122/123 n=2 ec=58/47 lis/c=122/71 les/c/f=123/72/0 sis=124 pruub=12.276672363s) [1] async=[1] r=-1 lpr=124 pi=[71,124)/1 crt=56'1130 mlcod 56'1130 active pruub 323.284393311s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:59 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 124 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=122/123 n=2 ec=58/47 lis/c=122/71 les/c/f=123/72/0 sis=124 pruub=12.276548386s) [1] r=-1 lpr=124 pi=[71,124)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 323.284393311s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:59 compute-2 sudo[96405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:24:59 compute-2 sudo[96405]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:24:59 compute-2 sudo[96405]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:59 compute-2 sudo[96430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:24:59 compute-2 sudo[96430]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:24:59 compute-2 sudo[96430]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:00.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:01.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Nov 29 06:25:02 compute-2 ceph-mon[77142]: 9.6 deep-scrub starts
Nov 29 06:25:02 compute-2 ceph-mon[77142]: 9.6 deep-scrub ok
Nov 29 06:25:02 compute-2 ceph-mon[77142]: pgmap v360: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:02.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:02 compute-2 sshd-session[96457]: Accepted publickey for zuul from 192.168.122.30 port 40368 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:25:02 compute-2 systemd-logind[784]: New session 36 of user zuul.
Nov 29 06:25:02 compute-2 systemd[1]: Started Session 36 of User zuul.
Nov 29 06:25:02 compute-2 sshd-session[96457]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:25:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:03.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:03 compute-2 python3.9[96610]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:25:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:04.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:04 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:05.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:05 compute-2 python3.9[96765]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:25:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Nov 29 06:25:06 compute-2 ceph-mon[77142]: 9.1a scrub starts
Nov 29 06:25:06 compute-2 ceph-mon[77142]: 9.1a scrub ok
Nov 29 06:25:06 compute-2 ceph-mon[77142]: pgmap v362: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 13 B/s, 0 objects/s recovering
Nov 29 06:25:06 compute-2 ceph-mon[77142]: osdmap e124: 3 total, 3 up, 3 in
Nov 29 06:25:06 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Nov 29 06:25:06 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Nov 29 06:25:06 compute-2 ceph-mon[77142]: osdmap e125: 3 total, 3 up, 3 in
Nov 29 06:25:06 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Nov 29 06:25:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:06.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:06 compute-2 python3.9[96959]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:25:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:07.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:07 compute-2 sshd-session[96460]: Connection closed by 192.168.122.30 port 40368
Nov 29 06:25:07 compute-2 sshd-session[96457]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:25:07 compute-2 systemd-logind[784]: Session 36 logged out. Waiting for processes to exit.
Nov 29 06:25:07 compute-2 systemd[1]: session-36.scope: Deactivated successfully.
Nov 29 06:25:07 compute-2 systemd[1]: session-36.scope: Consumed 2.256s CPU time.
Nov 29 06:25:07 compute-2 systemd-logind[784]: Removed session 36.
Nov 29 06:25:08 compute-2 sudo[96986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:08 compute-2 sudo[96986]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:08 compute-2 sudo[96986]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:08 compute-2 sudo[97011]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:25:08 compute-2 sudo[97011]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:08 compute-2 sudo[97011]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:08 compute-2 sudo[97036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:08 compute-2 sudo[97036]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:08 compute-2 sudo[97036]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:08 compute-2 sudo[97062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:25:08 compute-2 sudo[97062]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:08.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:09.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:10.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:11.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:11 compute-2 ceph-mon[77142]: pgmap v363: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 13 B/s, 0 objects/s recovering
Nov 29 06:25:11 compute-2 ceph-mon[77142]: pgmap v365: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:11 compute-2 ceph-mon[77142]: pgmap v366: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:11 compute-2 ceph-mon[77142]: 9.a scrub starts
Nov 29 06:25:11 compute-2 ceph-mon[77142]: 9.a scrub ok
Nov 29 06:25:11 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Nov 29 06:25:11 compute-2 ceph-mon[77142]: osdmap e126: 3 total, 3 up, 3 in
Nov 29 06:25:12 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:12 compute-2 podman[97158]: 2025-11-29 06:25:12.158887875 +0000 UTC m=+3.279462817 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Nov 29 06:25:12 compute-2 podman[97158]: 2025-11-29 06:25:12.257000656 +0000 UTC m=+3.377575498 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:25:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:12.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:13.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:13 compute-2 podman[97316]: 2025-11-29 06:25:13.812095489 +0000 UTC m=+0.859998325 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:25:13 compute-2 podman[97316]: 2025-11-29 06:25:13.884346586 +0000 UTC m=+0.932249372 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:25:14 compute-2 podman[97380]: 2025-11-29 06:25:14.261041485 +0000 UTC m=+0.139702441 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., name=keepalived, vcs-type=git, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, release=1793, version=2.2.4, distribution-scope=public, io.buildah.version=1.28.2, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9)
Nov 29 06:25:14 compute-2 podman[97400]: 2025-11-29 06:25:14.355048218 +0000 UTC m=+0.072977269 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, release=1793, vendor=Red Hat, Inc., vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, com.redhat.component=keepalived-container, distribution-scope=public, architecture=x86_64, io.openshift.tags=Ceph keepalived, name=keepalived, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2)
Nov 29 06:25:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:14.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:14 compute-2 podman[97380]: 2025-11-29 06:25:14.593301911 +0000 UTC m=+0.471962837 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, vendor=Red Hat, Inc., name=keepalived, version=2.2.4, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9)
Nov 29 06:25:14 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Nov 29 06:25:14 compute-2 ceph-mon[77142]: pgmap v368: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 06:25:14 compute-2 sudo[97062]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:15.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:15 compute-2 sshd-session[97414]: Accepted publickey for zuul from 192.168.122.30 port 50758 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:25:15 compute-2 systemd-logind[784]: New session 37 of user zuul.
Nov 29 06:25:15 compute-2 systemd[1]: Started Session 37 of User zuul.
Nov 29 06:25:15 compute-2 sshd-session[97414]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:25:15 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 127 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=127) [2] r=0 lpr=127 pi=[93,127)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:25:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Nov 29 06:25:16 compute-2 ceph-mon[77142]: pgmap v369: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 06:25:16 compute-2 ceph-mon[77142]: pgmap v370: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 06:25:16 compute-2 ceph-mon[77142]: pgmap v371: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 06:25:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 06:25:16 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:16 compute-2 ceph-mon[77142]: osdmap e127: 3 total, 3 up, 3 in
Nov 29 06:25:16 compute-2 python3.9[97567]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:25:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:16.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:17.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:17 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:17 compute-2 python3.9[97722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:25:18 compute-2 sudo[97876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agcstgidtihjrcomzmpfhhzsaxzbaklh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397518.0202456-88-133227580923353/AnsiballZ_setup.py'
Nov 29 06:25:18 compute-2 sudo[97876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:18.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:18 compute-2 python3.9[97878]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:25:18 compute-2 sudo[97876]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:19 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Nov 29 06:25:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 06:25:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 06:25:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 06:25:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:19 compute-2 ceph-mon[77142]: osdmap e128: 3 total, 3 up, 3 in
Nov 29 06:25:19 compute-2 ceph-mon[77142]: pgmap v374: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Nov 29 06:25:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:19.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:25:19 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:25:19 compute-2 sudo[97961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jszmfbpanfknsyfdxlewuvqxxjwxzyjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397518.0202456-88-133227580923353/AnsiballZ_dnf.py'
Nov 29 06:25:19 compute-2 sudo[97961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:19 compute-2 python3.9[97963]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:25:19 compute-2 sudo[97964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:19 compute-2 sudo[97964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:19 compute-2 sudo[97964]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:19 compute-2 sudo[97990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:25:19 compute-2 sudo[97990]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:19 compute-2 sudo[97990]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:19 compute-2 sudo[98015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:19 compute-2 sudo[98015]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:19 compute-2 sudo[98015]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:19 compute-2 sudo[98040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:25:19 compute-2 sudo[98040]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:19 compute-2 sudo[98048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:19 compute-2 sudo[98048]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:19 compute-2 sudo[98048]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:20 compute-2 sudo[98090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:20 compute-2 sudo[98090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:20 compute-2 sudo[98090]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:20 compute-2 sudo[98040]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:20.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:21.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:22 compute-2 ceph-mon[77142]: 9.d scrub starts
Nov 29 06:25:22 compute-2 ceph-mon[77142]: 9.d scrub ok
Nov 29 06:25:22 compute-2 ceph-mon[77142]: 9.f scrub starts
Nov 29 06:25:22 compute-2 ceph-mon[77142]: 9.f scrub ok
Nov 29 06:25:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:22 compute-2 ceph-mon[77142]: pgmap v375: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Nov 29 06:25:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Nov 29 06:25:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:22 compute-2 ceph-mon[77142]: osdmap e129: 3 total, 3 up, 3 in
Nov 29 06:25:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:22 compute-2 sudo[97961]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:22 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:25:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:22.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:25:22 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Nov 29 06:25:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:23.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:23 compute-2 sudo[98298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccfgpvyagmkvdyfbyzoqasjgcbjgxerg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397522.675422-123-18067336515695/AnsiballZ_setup.py'
Nov 29 06:25:23 compute-2 sudo[98298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:23 compute-2 python3.9[98300]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:25:23 compute-2 sudo[98298]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:24 compute-2 ceph-mon[77142]: pgmap v377: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 145 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:25:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:25:24 compute-2 ceph-mon[77142]: pgmap v378: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 145 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Nov 29 06:25:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:24 compute-2 ceph-mon[77142]: osdmap e130: 3 total, 3 up, 3 in
Nov 29 06:25:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:25:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:25:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:25:24 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Nov 29 06:25:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:25:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:24.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:25:24 compute-2 sudo[98494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umkzduzdmklgywagposrmsowyskleoos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397524.466753-158-3919834487986/AnsiballZ_file.py'
Nov 29 06:25:24 compute-2 sudo[98494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:25.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Nov 29 06:25:25 compute-2 ceph-mon[77142]: osdmap e131: 3 total, 3 up, 3 in
Nov 29 06:25:25 compute-2 ceph-mon[77142]: pgmap v381: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 145 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:25 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:25:25 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:25:25 compute-2 python3.9[98496]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:25 compute-2 sudo[98494]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:25 compute-2 sudo[98646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htryfisktkpihizgvrczpiztnrkpisak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397525.5431604-180-215275482807428/AnsiballZ_command.py'
Nov 29 06:25:25 compute-2 sudo[98646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:26 compute-2 python3.9[98648]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:25:26 compute-2 sudo[98646]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:26.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Nov 29 06:25:26 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:25:26 compute-2 ceph-mon[77142]: osdmap e132: 3 total, 3 up, 3 in
Nov 29 06:25:26 compute-2 sudo[98812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yutimjalwhcjceaehxxyxllawpygbrfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397526.4874024-204-7605747001836/AnsiballZ_stat.py'
Nov 29 06:25:26 compute-2 sudo[98812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:27.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:27 compute-2 python3.9[98814]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:27 compute-2 sudo[98812]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:27 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:27 compute-2 sudo[98890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-triklkgtmhbqegyvbbfejdyzdhzmfjmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397526.4874024-204-7605747001836/AnsiballZ_file.py'
Nov 29 06:25:27 compute-2 sudo[98890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:27 compute-2 python3.9[98892]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:27 compute-2 sudo[98890]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:27 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Nov 29 06:25:28 compute-2 sudo[99042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xngsyrtbpzfizjolpekrblcbqmcjwssg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397527.9224732-240-111031955310544/AnsiballZ_stat.py'
Nov 29 06:25:28 compute-2 sudo[99042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:28 compute-2 ceph-mon[77142]: pgmap v383: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 145 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:28 compute-2 ceph-mon[77142]: osdmap e133: 3 total, 3 up, 3 in
Nov 29 06:25:28 compute-2 python3.9[99044]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:28 compute-2 sudo[99042]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:25:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:28.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:25:28 compute-2 sudo[99121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgvzgtxtilhkctsrgnlbooiqcrcpuzje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397527.9224732-240-111031955310544/AnsiballZ_file.py'
Nov 29 06:25:28 compute-2 sudo[99121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:28 compute-2 python3.9[99123]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:25:28 compute-2 sudo[99121]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:25:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:29.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:25:29 compute-2 sudo[99273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoggcpqixdzjwpfjjwihnmrxmoyctlhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397529.285257-279-83268084688241/AnsiballZ_ini_file.py'
Nov 29 06:25:29 compute-2 sudo[99273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:30 compute-2 python3.9[99275]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:25:30 compute-2 sudo[99273]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:30.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:30 compute-2 sudo[99426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdwkkzdbafdgdkqmcinbncmlnjxhwggy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397530.2020452-279-234827701128607/AnsiballZ_ini_file.py'
Nov 29 06:25:30 compute-2 sudo[99426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:30 compute-2 python3.9[99428]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:25:30 compute-2 sudo[99426]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:31 compute-2 sudo[99578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiyjokszulagyvvbdpyhxxhmhmcfldsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397530.844693-279-191617310143432/AnsiballZ_ini_file.py'
Nov 29 06:25:31 compute-2 sudo[99578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:31.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:31 compute-2 python3.9[99580]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:25:31 compute-2 sudo[99578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:31 compute-2 sudo[99730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbrbxzgvxhdoygbhbpygqcjznywcztfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397531.4475725-279-40711362357352/AnsiballZ_ini_file.py'
Nov 29 06:25:31 compute-2 sudo[99730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:31 compute-2 python3.9[99732]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:25:31 compute-2 sudo[99730]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:32 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Nov 29 06:25:32 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Nov 29 06:25:32 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:32.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:33.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:33 compute-2 sudo[99883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saesjstotlxlcnsvmbtoptbkuwmgidsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397532.777622-372-76785931481789/AnsiballZ_dnf.py'
Nov 29 06:25:33 compute-2 sudo[99883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:33 compute-2 python3.9[99885]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:25:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:25:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:34.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:25:35 compute-2 sudo[99883]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:35.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:36.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:36 compute-2 sudo[100038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmdcqkwdkgxbnrxrzsjfavqchepwcmfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397536.473849-405-236419579523818/AnsiballZ_setup.py'
Nov 29 06:25:36 compute-2 sudo[100038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:36 compute-2 ceph-mon[77142]: osdmap e134: 3 total, 3 up, 3 in
Nov 29 06:25:36 compute-2 ceph-mon[77142]: pgmap v386: 305 pgs: 1 peering, 1 unknown, 303 active+clean; 456 KiB data, 145 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:37 compute-2 python3.9[100040]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:25:37 compute-2 sudo[100038]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:37.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:37 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:37 compute-2 sudo[100192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lobbehmnokjahezpoqgznvluqoqzznyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397537.4092753-430-216168252803602/AnsiballZ_stat.py'
Nov 29 06:25:37 compute-2 sudo[100192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:37 compute-2 python3.9[100194]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:25:37 compute-2 sudo[100192]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:38.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:38 compute-2 sudo[100345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exjfcjmqlrhnanyucdxvyolopwqwfqiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397538.3716905-458-209233196615075/AnsiballZ_stat.py'
Nov 29 06:25:38 compute-2 sudo[100345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:38 compute-2 python3.9[100347]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:25:38 compute-2 sudo[100345]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:39.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:39 compute-2 sudo[100497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwrykifynejokbhscpkdaptjomvurtzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397539.2112956-486-167335548099024/AnsiballZ_command.py'
Nov 29 06:25:39 compute-2 sudo[100497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:39 compute-2 python3.9[100499]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:25:39 compute-2 sudo[100497]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:40 compute-2 sudo[100561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:40 compute-2 sudo[100561]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:40 compute-2 sudo[100561]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:40 compute-2 sudo[100602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:40 compute-2 sudo[100602]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:40 compute-2 sudo[100602]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:40 compute-2 sudo[100701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vglsbjlsmbrbfjjlruweiysranfizzqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397540.0876353-517-220194755694738/AnsiballZ_service_facts.py'
Nov 29 06:25:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:40 compute-2 sudo[100701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:25:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:40.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:25:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Nov 29 06:25:40 compute-2 python3.9[100703]: ansible-service_facts Invoked
Nov 29 06:25:40 compute-2 network[100720]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:25:40 compute-2 network[100721]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:25:40 compute-2 network[100722]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:25:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:41.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:41 compute-2 ceph-mon[77142]: pgmap v387: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 8.3 KiB/s rd, 170 B/s wr, 15 op/s; 109 B/s, 2 objects/s recovering
Nov 29 06:25:41 compute-2 ceph-mon[77142]: 9.1f deep-scrub starts
Nov 29 06:25:41 compute-2 ceph-mon[77142]: 9.1f deep-scrub ok
Nov 29 06:25:41 compute-2 ceph-mon[77142]: pgmap v388: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 62 B/s, 0 objects/s recovering
Nov 29 06:25:41 compute-2 ceph-mon[77142]: 9.10 scrub starts
Nov 29 06:25:41 compute-2 ceph-mon[77142]: 9.1d scrub starts
Nov 29 06:25:41 compute-2 ceph-mon[77142]: 9.1d scrub ok
Nov 29 06:25:41 compute-2 ceph-mon[77142]: 9.10 scrub ok
Nov 29 06:25:41 compute-2 ceph-mon[77142]: pgmap v389: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:41 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:25:42 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:42.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:43.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:43 compute-2 sudo[100701]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:44.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:45.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:46.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Nov 29 06:25:46 compute-2 ceph-mon[77142]: pgmap v390: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:25:46 compute-2 ceph-mon[77142]: 9.11 scrub starts
Nov 29 06:25:46 compute-2 ceph-mon[77142]: 9.11 scrub ok
Nov 29 06:25:46 compute-2 ceph-mon[77142]: pgmap v391: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:46 compute-2 ceph-mon[77142]: pgmap v392: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:25:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:25:46 compute-2 ceph-mon[77142]: osdmap e135: 3 total, 3 up, 3 in
Nov 29 06:25:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:25:47 compute-2 sudo[101008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwvztrygemhddayukseiwfbvvaecmebh ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764397546.731157-561-80201613183426/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764397546.731157-561-80201613183426/args'
Nov 29 06:25:47 compute-2 sudo[101008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:47.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:47 compute-2 sudo[101008]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:47 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:47 compute-2 sudo[101175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thnjziagelxxufjniffqsfrueazeyqcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397547.66347-595-6353207083647/AnsiballZ_dnf.py'
Nov 29 06:25:47 compute-2 sudo[101175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:48 compute-2 python3.9[101177]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:25:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:25:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:48.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:25:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:49.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:49 compute-2 sudo[101175]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:50.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:51.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Nov 29 06:25:51 compute-2 ceph-mon[77142]: pgmap v394: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:51 compute-2 ceph-mon[77142]: pgmap v395: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:25:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:25:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:25:52 compute-2 sudo[101330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-racebtdrooxbzqxfcambwlkemyeduliw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397551.6659198-634-187638349200447/AnsiballZ_package_facts.py'
Nov 29 06:25:52 compute-2 sudo[101330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:52 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:52.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:52 compute-2 python3.9[101332]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 06:25:52 compute-2 sudo[101330]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:53.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:53 compute-2 ceph-mon[77142]: pgmap v396: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:53 compute-2 ceph-mon[77142]: osdmap e136: 3 total, 3 up, 3 in
Nov 29 06:25:53 compute-2 ceph-mon[77142]: pgmap v398: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:53 compute-2 ceph-mon[77142]: pgmap v399: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:53 compute-2 ceph-mon[77142]: osdmap e137: 3 total, 3 up, 3 in
Nov 29 06:25:53 compute-2 ceph-mon[77142]: pgmap v401: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:53 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:54 compute-2 sudo[101483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgxrwmpyzqcqexthfphaevtajcroevcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397553.7123828-665-67417698651821/AnsiballZ_stat.py'
Nov 29 06:25:54 compute-2 sudo[101483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:54 compute-2 python3.9[101485]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:54 compute-2 sudo[101483]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:54 compute-2 sudo[101536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:54 compute-2 sudo[101536]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:54 compute-2 sudo[101536]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:54 compute-2 sudo[101587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xppfeernvhknaenrghhbopknwocmokuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397553.7123828-665-67417698651821/AnsiballZ_file.py'
Nov 29 06:25:54 compute-2 sudo[101587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:54.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:54 compute-2 sudo[101588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:25:54 compute-2 sudo[101588]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:54 compute-2 sudo[101588]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:54 compute-2 python3.9[101597]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:54 compute-2 sudo[101587]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:55.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:55 compute-2 sudo[101764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxrdeaowmlkpvnnxqarkiawinbgkvdrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397555.1124604-701-198274522249723/AnsiballZ_stat.py'
Nov 29 06:25:55 compute-2 sudo[101764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:55 compute-2 python3.9[101766]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:55 compute-2 sudo[101764]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:55 compute-2 sudo[101842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guenvocohmsqobiqmmttggwobhmhqxmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397555.1124604-701-198274522249723/AnsiballZ_file.py'
Nov 29 06:25:55 compute-2 sudo[101842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:56 compute-2 python3.9[101844]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:56 compute-2 sudo[101842]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:25:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:56.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:25:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:25:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:57.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:25:57 compute-2 sudo[101995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egzpsvcyuttdsrkzmulpgdgwyziectxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397557.3044705-755-174542461910022/AnsiballZ_lineinfile.py'
Nov 29 06:25:57 compute-2 sudo[101995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:57 compute-2 python3.9[101997]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:57 compute-2 sudo[101995]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:58 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:58.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:25:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:59.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:59 compute-2 sudo[102148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lygzzmyfudevrmakceozhqfobrgtpkqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397559.4364955-799-29880682942651/AnsiballZ_setup.py'
Nov 29 06:25:59 compute-2 sudo[102148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Nov 29 06:25:59 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:26:00 compute-2 python3.9[102150]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:26:00 compute-2 sudo[102148]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:00 compute-2 sudo[102159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:00 compute-2 sudo[102159]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:00 compute-2 sudo[102159]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:00 compute-2 sudo[102184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:00 compute-2 sudo[102184]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:00 compute-2 sudo[102184]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.002000055s ======
Nov 29 06:26:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:00.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Nov 29 06:26:00 compute-2 sudo[102283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeogesuosqlphvstpmrpbwhcchdtpryl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397559.4364955-799-29880682942651/AnsiballZ_systemd.py'
Nov 29 06:26:00 compute-2 sudo[102283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:01.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:01 compute-2 python3.9[102285]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:26:01 compute-2 sudo[102283]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:26:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:02.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:26:02 compute-2 sshd-session[97417]: Connection closed by 192.168.122.30 port 50758
Nov 29 06:26:02 compute-2 sshd-session[97414]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:26:02 compute-2 systemd[1]: session-37.scope: Deactivated successfully.
Nov 29 06:26:02 compute-2 systemd[1]: session-37.scope: Consumed 23.554s CPU time.
Nov 29 06:26:02 compute-2 systemd-logind[784]: Session 37 logged out. Waiting for processes to exit.
Nov 29 06:26:02 compute-2 systemd-logind[784]: Removed session 37.
Nov 29 06:26:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:03.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:04.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:04 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Nov 29 06:26:04 compute-2 ceph-mon[77142]: pgmap v402: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:04 compute-2 ceph-mon[77142]: pgmap v403: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:04 compute-2 ceph-mon[77142]: pgmap v404: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:26:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:05.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:26:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:06.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:26:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:26:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:07.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:26:08 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:08.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:08 compute-2 sshd-session[102316]: Accepted publickey for zuul from 192.168.122.30 port 53884 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:26:08 compute-2 systemd-logind[784]: New session 38 of user zuul.
Nov 29 06:26:08 compute-2 systemd[1]: Started Session 38 of User zuul.
Nov 29 06:26:08 compute-2 sshd-session[102316]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:26:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:09.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:09 compute-2 sudo[102469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qplvvhjdxxvurwwljyrsepoljgsudfir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397568.8684173-33-191891001871677/AnsiballZ_file.py'
Nov 29 06:26:09 compute-2 sudo[102469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:09 compute-2 python3.9[102471]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:09 compute-2 ceph-mon[77142]: pgmap v406: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 12 B/s, 0 objects/s recovering
Nov 29 06:26:09 compute-2 ceph-mon[77142]: osdmap e138: 3 total, 3 up, 3 in
Nov 29 06:26:09 compute-2 ceph-mon[77142]: pgmap v407: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:26:09 compute-2 ceph-mon[77142]: pgmap v408: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:09 compute-2 ceph-mon[77142]: osdmap e139: 3 total, 3 up, 3 in
Nov 29 06:26:09 compute-2 sudo[102469]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:10 compute-2 sudo[102621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnimhxxvcenlesgxvylvxkadysitusdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397569.9388936-69-143431403295374/AnsiballZ_stat.py'
Nov 29 06:26:10 compute-2 sudo[102621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:10 compute-2 python3.9[102624]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:26:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:10.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:26:10 compute-2 sudo[102621]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:10 compute-2 sudo[102700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esplddbmyzfgralpzqazljctyafkzlxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397569.9388936-69-143431403295374/AnsiballZ_file.py'
Nov 29 06:26:10 compute-2 sudo[102700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:10 compute-2 ceph-mon[77142]: pgmap v410: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:10 compute-2 ceph-mon[77142]: pgmap v411: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:10 compute-2 python3.9[102702]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:11 compute-2 sudo[102700]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:11.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:11 compute-2 sshd-session[102319]: Connection closed by 192.168.122.30 port 53884
Nov 29 06:26:11 compute-2 sshd-session[102316]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:26:11 compute-2 systemd-logind[784]: Session 38 logged out. Waiting for processes to exit.
Nov 29 06:26:11 compute-2 systemd[1]: session-38.scope: Deactivated successfully.
Nov 29 06:26:11 compute-2 systemd[1]: session-38.scope: Consumed 1.434s CPU time.
Nov 29 06:26:11 compute-2 systemd-logind[784]: Removed session 38.
Nov 29 06:26:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:26:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:12.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:26:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:13.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:13 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:13 compute-2 ceph-mon[77142]: pgmap v412: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:26:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:14.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.662095) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574662170, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7900, "num_deletes": 256, "total_data_size": 16721993, "memory_usage": 16933328, "flush_reason": "Manual Compaction"}
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574730915, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 10246102, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 257, "largest_seqno": 7905, "table_properties": {"data_size": 10212502, "index_size": 22413, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10117, "raw_key_size": 96628, "raw_average_key_size": 24, "raw_value_size": 10133446, "raw_average_value_size": 2519, "num_data_blocks": 982, "num_entries": 4022, "num_filter_entries": 4022, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 1764397158, "file_creation_time": 1764397574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 69148 microseconds, and 20658 cpu microseconds.
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.731246) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 10246102 bytes OK
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.731336) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.732553) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.732570) EVENT_LOG_v1 {"time_micros": 1764397574732564, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.732592) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 16679049, prev total WAL file size 16679684, number of live WAL files 2.
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.736486) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10005KB) 8(1648B)]
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574736648, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 10247750, "oldest_snapshot_seqno": -1}
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3769 keys, 10242318 bytes, temperature: kUnknown
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574808595, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 10242318, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10209411, "index_size": 22365, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9477, "raw_key_size": 92412, "raw_average_key_size": 24, "raw_value_size": 10133504, "raw_average_value_size": 2688, "num_data_blocks": 982, "num_entries": 3769, "num_filter_entries": 3769, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764397574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.808946) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 10242318 bytes
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.810573) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.3 rd, 142.2 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(9.8, 0.0 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 4027, records dropped: 258 output_compression: NoCompression
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.810604) EVENT_LOG_v1 {"time_micros": 1764397574810591, "job": 4, "event": "compaction_finished", "compaction_time_micros": 72031, "compaction_time_cpu_micros": 23599, "output_level": 6, "num_output_files": 1, "total_output_size": 10242318, "num_input_records": 4027, "num_output_records": 3769, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574813113, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574813183, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 29 06:26:14 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.736219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:26:15 compute-2 ceph-mon[77142]: pgmap v413: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:15.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:26:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:16.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:26:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:17.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:17 compute-2 sshd-session[102732]: Accepted publickey for zuul from 192.168.122.30 port 39118 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:26:17 compute-2 systemd-logind[784]: New session 39 of user zuul.
Nov 29 06:26:17 compute-2 systemd[1]: Started Session 39 of User zuul.
Nov 29 06:26:17 compute-2 sshd-session[102732]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:26:18 compute-2 ceph-mon[77142]: pgmap v414: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:26:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:18.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:18 compute-2 python3.9[102886]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:26:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:19.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:20 compute-2 sudo[103041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srqsikmkzffosszujnfnpmptbvbknqdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397579.9107597-66-185065785778020/AnsiballZ_file.py'
Nov 29 06:26:20 compute-2 sudo[103041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:20 compute-2 sudo[103044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:20 compute-2 sudo[103044]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:20 compute-2 sudo[103044]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:26:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:20.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:26:20 compute-2 sudo[103069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:20 compute-2 sudo[103069]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:20 compute-2 sudo[103069]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:20 compute-2 python3.9[103043]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:20 compute-2 sudo[103041]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:20 compute-2 ceph-mon[77142]: pgmap v415: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 9 B/s, 0 objects/s recovering
Nov 29 06:26:20 compute-2 ceph-mon[77142]: pgmap v416: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 9 B/s, 0 objects/s recovering
Nov 29 06:26:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:21.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:21 compute-2 sudo[103266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voddgtxmjgjzaofgkjkvqexopfbttntt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397580.9811645-90-215885594195213/AnsiballZ_stat.py'
Nov 29 06:26:21 compute-2 sudo[103266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:21 compute-2 python3.9[103268]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:21 compute-2 sudo[103266]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:22 compute-2 sudo[103344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzkevabfksxgnyxpfyeezjofwrmlwgdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397580.9811645-90-215885594195213/AnsiballZ_file.py'
Nov 29 06:26:22 compute-2 sudo[103344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:22 compute-2 python3.9[103346]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.spqdks13 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:22 compute-2 sudo[103344]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:22 compute-2 ceph-mon[77142]: pgmap v417: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:22.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:23.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:23 compute-2 sudo[103497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxlflageduyhdaebjuapkzivbwnyvpvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397583.2179341-151-103350210248500/AnsiballZ_stat.py'
Nov 29 06:26:23 compute-2 sudo[103497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:23 compute-2 python3.9[103499]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:23 compute-2 sudo[103497]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:23 compute-2 sudo[103575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngnjvqxosrukvdyvlhuczwkcyezppieb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397583.2179341-151-103350210248500/AnsiballZ_file.py'
Nov 29 06:26:23 compute-2 sudo[103575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:24 compute-2 python3.9[103577]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=._4aepc4w recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:24 compute-2 sudo[103575]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:24 compute-2 ceph-mon[77142]: pgmap v418: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:24.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:24 compute-2 sudo[103728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdlpivwzsvpxzsxjnrhrponuaykfkgjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397584.5217469-189-30157796495112/AnsiballZ_file.py'
Nov 29 06:26:24 compute-2 sudo[103728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:25 compute-2 python3.9[103730]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:25 compute-2 sudo[103728]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:25.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:25 compute-2 sudo[103880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtcloaxqoqzfexyjdecievzpucpafbpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397585.5077107-213-137055556982981/AnsiballZ_stat.py'
Nov 29 06:26:25 compute-2 sudo[103880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:26 compute-2 python3.9[103882]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:26 compute-2 sudo[103880]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:26 compute-2 sudo[103959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiemhihwiqkgoihppqdkppenhzhspola ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397585.5077107-213-137055556982981/AnsiballZ_file.py'
Nov 29 06:26:26 compute-2 sudo[103959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:26:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:26.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:26:26 compute-2 python3.9[103961]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:26 compute-2 sudo[103959]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:27 compute-2 sudo[104111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxjpwtbpewkzvvmgcbmjbdzywoprqlwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397586.762745-213-276021451123115/AnsiballZ_stat.py'
Nov 29 06:26:27 compute-2 sudo[104111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:27.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:27 compute-2 python3.9[104113]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:27 compute-2 sudo[104111]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:27 compute-2 sudo[104191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khebvwswpxfgssdnnmtizacurtwnnqmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397586.762745-213-276021451123115/AnsiballZ_file.py'
Nov 29 06:26:27 compute-2 sudo[104191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:27 compute-2 python3.9[104193]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:27 compute-2 sudo[104191]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:27 compute-2 sshd-session[104129]: Invalid user debian from 92.118.39.92 port 51852
Nov 29 06:26:27 compute-2 sshd-session[104129]: Connection closed by invalid user debian 92.118.39.92 port 51852 [preauth]
Nov 29 06:26:28 compute-2 sudo[104344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxbigprkmendjzgdtjvrzergdkxcwokl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397588.094239-283-253729008192878/AnsiballZ_file.py'
Nov 29 06:26:28 compute-2 sudo[104344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:28.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:28 compute-2 python3.9[104346]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:28 compute-2 sudo[104344]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:29.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:29 compute-2 sudo[104496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sootodelyvmgrrdtqoqxocogiztctuvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397588.9433308-307-71385165986631/AnsiballZ_stat.py'
Nov 29 06:26:29 compute-2 sudo[104496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:29 compute-2 python3.9[104498]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:29 compute-2 sudo[104496]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:29 compute-2 sudo[104574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfgnphvgsacxtaqyvflvjwfqtmstukcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397588.9433308-307-71385165986631/AnsiballZ_file.py'
Nov 29 06:26:29 compute-2 sudo[104574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:29 compute-2 python3.9[104576]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:29 compute-2 sudo[104574]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:30.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:30 compute-2 sudo[104727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zafutjhpbsalehfbufwzobqxikijlfie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397590.293874-342-154553934581036/AnsiballZ_stat.py'
Nov 29 06:26:30 compute-2 sudo[104727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:30 compute-2 python3.9[104729]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:30 compute-2 ceph-mon[77142]: pgmap v419: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:30 compute-2 sudo[104727]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:31 compute-2 sudo[104805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enecrkiaibomwdxrqavwhozdyeskkokt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397590.293874-342-154553934581036/AnsiballZ_file.py'
Nov 29 06:26:31 compute-2 sudo[104805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:31.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:31 compute-2 python3.9[104807]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:31 compute-2 sudo[104805]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:32 compute-2 sudo[104957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxfrjmxhpvpxjsivbsxtradbuireizsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397591.522534-378-205265031953240/AnsiballZ_systemd.py'
Nov 29 06:26:32 compute-2 sudo[104957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:32 compute-2 python3.9[104959]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:26:32 compute-2 systemd[1]: Reloading.
Nov 29 06:26:32 compute-2 systemd-rc-local-generator[104988]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:26:32 compute-2 systemd-sysv-generator[104991]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:26:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:32.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:32 compute-2 sudo[104957]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:32 compute-2 ceph-mon[77142]: pgmap v420: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:32 compute-2 ceph-mon[77142]: pgmap v421: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:32 compute-2 ceph-mon[77142]: pgmap v422: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:33.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:33 compute-2 sudo[105147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsvqvenggsyzltkjqidkgguczqclnybj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397593.1248353-402-90234210081873/AnsiballZ_stat.py'
Nov 29 06:26:33 compute-2 sudo[105147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:33 compute-2 python3.9[105149]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:33 compute-2 sudo[105147]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:34 compute-2 sudo[105225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arwkbzlkcnplmwatefnjnipwzsxiwuea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397593.1248353-402-90234210081873/AnsiballZ_file.py'
Nov 29 06:26:34 compute-2 sudo[105225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:34 compute-2 python3.9[105227]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:34 compute-2 sudo[105225]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:26:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:34.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:26:34 compute-2 sudo[105378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffularffxbvvadifgbkdxjhrbirfofjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397594.689203-439-201040166522108/AnsiballZ_stat.py'
Nov 29 06:26:34 compute-2 sudo[105378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:26:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:35.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:26:35 compute-2 python3.9[105380]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:35 compute-2 sudo[105378]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:35 compute-2 sudo[105456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxlqnqhwtqxqugitozvogketxnizpqzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397594.689203-439-201040166522108/AnsiballZ_file.py'
Nov 29 06:26:35 compute-2 sudo[105456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:35 compute-2 python3.9[105458]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:35 compute-2 sudo[105456]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:36 compute-2 sudo[105608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtlilqxrdwtkuwfgzfkyzjrshduerlux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397595.924874-474-92234078672906/AnsiballZ_systemd.py'
Nov 29 06:26:36 compute-2 sudo[105608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:36 compute-2 python3.9[105610]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:26:36 compute-2 systemd[1]: Reloading.
Nov 29 06:26:36 compute-2 systemd-sysv-generator[105642]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:26:36 compute-2 systemd-rc-local-generator[105635]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:26:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:36.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:36 compute-2 ceph-mon[77142]: pgmap v423: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:36 compute-2 systemd[1]: Starting Create netns directory...
Nov 29 06:26:36 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:26:36 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:26:36 compute-2 systemd[1]: Finished Create netns directory.
Nov 29 06:26:36 compute-2 sudo[105608]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:37.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:37 compute-2 python3.9[105802]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:26:37 compute-2 network[105819]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:26:37 compute-2 ceph-mon[77142]: pgmap v424: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:37 compute-2 ceph-mon[77142]: pgmap v425: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:37 compute-2 network[105820]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:26:37 compute-2 network[105821]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:26:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:38.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:39.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:40 compute-2 ceph-mon[77142]: pgmap v426: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:26:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:40.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:26:40 compute-2 sudo[105891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:40 compute-2 sudo[105891]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:40 compute-2 sudo[105891]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:40 compute-2 sudo[105920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:40 compute-2 sudo[105920]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:40 compute-2 sudo[105920]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:41.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:26:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:42.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:26:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:43.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:44 compute-2 sudo[106135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrirffiuhfuamansjorluwrazykvrfgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397604.178764-553-137840283687117/AnsiballZ_stat.py'
Nov 29 06:26:44 compute-2 sudo[106135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:44.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:45.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:26:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:46.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:26:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:47.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:48 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:26:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:26:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:48.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:26:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:49 compute-2 python3.9[106137]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:49.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:49 compute-2 sudo[106135]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:49 compute-2 sudo[106216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnlmlveynlvvwlslktydmvbgrkjfczoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397604.178764-553-137840283687117/AnsiballZ_file.py'
Nov 29 06:26:49 compute-2 sudo[106216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:49 compute-2 python3.9[106218]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:49 compute-2 sudo[106216]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:50 compute-2 sudo[106369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaynzlmhqmdnuybqzxlwcgihqomidnsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397610.1764238-591-196079987906189/AnsiballZ_file.py'
Nov 29 06:26:50 compute-2 sudo[106369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:26:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:50.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:26:50 compute-2 python3.9[106371]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:50 compute-2 sudo[106369]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:26:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:51.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:26:51 compute-2 sudo[106521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jznhqiyzrbryvgfiebddebzvllaiyqmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397611.0234258-616-89887754882147/AnsiballZ_stat.py'
Nov 29 06:26:51 compute-2 sudo[106521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:51 compute-2 python3.9[106523]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:51 compute-2 sudo[106521]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:51 compute-2 sudo[106599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqgpkmqqfbysaxzuoeaescgobkgbqlkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397611.0234258-616-89887754882147/AnsiballZ_file.py'
Nov 29 06:26:51 compute-2 sudo[106599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:52 compute-2 python3.9[106601]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:52 compute-2 sudo[106599]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:52 compute-2 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 06:26:52 compute-2 ceph-mon[77142]: paxos.1).electionLogic(15) init, last seen epoch 15, mid-election, bumping
Nov 29 06:26:52 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:26:52 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:26:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:52.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:52 compute-2 ceph-mon[77142]: pgmap v427: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:53 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:26:53 compute-2 sudo[106752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baguhzsivjmfuxdqpnjwxtansrpmpzri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397612.7104979-661-86203761115899/AnsiballZ_timezone.py'
Nov 29 06:26:53 compute-2 sudo[106752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:53.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:53 compute-2 python3.9[106754]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 06:26:53 compute-2 systemd[1]: Starting Time & Date Service...
Nov 29 06:26:53 compute-2 systemd[1]: Started Time & Date Service.
Nov 29 06:26:53 compute-2 sudo[106752]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:54 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:54 compute-2 sudo[106908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vribfpthvqqprtrmohpqndifrdxoybee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397613.9973254-688-269636703453723/AnsiballZ_file.py'
Nov 29 06:26:54 compute-2 sudo[106908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:54 compute-2 python3.9[106910]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:54 compute-2 sudo[106908]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:54 compute-2 sudo[106912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:54 compute-2 sudo[106912]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:54 compute-2 sudo[106912]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:26:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:54.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:26:54 compute-2 sudo[106961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:26:54 compute-2 sudo[106961]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:54 compute-2 sudo[106961]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:54 compute-2 sudo[106986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:54 compute-2 sudo[106986]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:54 compute-2 sudo[106986]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:54 compute-2 sudo[107019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:26:54 compute-2 sudo[107019]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:54 compute-2 ceph-mon[77142]: pgmap v428: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-2 ceph-mon[77142]: pgmap v429: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-2 ceph-mon[77142]: pgmap v430: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-2 ceph-mon[77142]: pgmap v431: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-2 ceph-mon[77142]: pgmap v432: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-2 ceph-mon[77142]: mon.compute-1 calling monitor election
Nov 29 06:26:54 compute-2 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 06:26:54 compute-2 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 06:26:54 compute-2 ceph-mon[77142]: pgmap v433: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-2 ceph-mon[77142]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 06:26:54 compute-2 ceph-mon[77142]: monmap e3: 3 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-1=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Nov 29 06:26:54 compute-2 ceph-mon[77142]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:26:54 compute-2 ceph-mon[77142]: osdmap e139: 3 total, 3 up, 3 in
Nov 29 06:26:54 compute-2 ceph-mon[77142]: mgrmap e10: compute-0.vxabpq(active, since 9m), standbys: compute-2.ngsyhe, compute-1.gaxpay
Nov 29 06:26:54 compute-2 ceph-mon[77142]: overall HEALTH_OK
Nov 29 06:26:55 compute-2 sudo[107191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxrbnfdbpffssddzhuqqiqxhabixqqii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397614.7579434-711-254622761602022/AnsiballZ_stat.py'
Nov 29 06:26:55 compute-2 sudo[107191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:55.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:55 compute-2 python3.9[107200]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:55 compute-2 sudo[107191]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:55 compute-2 podman[107233]: 2025-11-29 06:26:55.275547334 +0000 UTC m=+0.087465020 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 06:26:55 compute-2 podman[107233]: 2025-11-29 06:26:55.393258009 +0000 UTC m=+0.205175675 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 06:26:55 compute-2 sudo[107354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfoycufaqjglbkzopeahpubfsprxnrqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397614.7579434-711-254622761602022/AnsiballZ_file.py'
Nov 29 06:26:55 compute-2 sudo[107354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:55 compute-2 python3.9[107359]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:55 compute-2 sudo[107354]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:56 compute-2 podman[107489]: 2025-11-29 06:26:56.010305044 +0000 UTC m=+0.086074661 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:26:56 compute-2 podman[107536]: 2025-11-29 06:26:56.087996403 +0000 UTC m=+0.062074748 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:26:56 compute-2 ceph-mon[77142]: pgmap v434: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:56 compute-2 podman[107489]: 2025-11-29 06:26:56.16742006 +0000 UTC m=+0.243189647 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:26:56 compute-2 sudo[107652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izthgxrwvcvyskkgnvdlwbtohuyvvlia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397615.9936326-748-14587916809385/AnsiballZ_stat.py'
Nov 29 06:26:56 compute-2 sudo[107652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:56 compute-2 python3.9[107661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:56 compute-2 sudo[107652]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:56 compute-2 podman[107683]: 2025-11-29 06:26:56.556191832 +0000 UTC m=+0.216098438 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, distribution-scope=public, io.buildah.version=1.28.2, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4)
Nov 29 06:26:56 compute-2 podman[107683]: 2025-11-29 06:26:56.566941579 +0000 UTC m=+0.226848175 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, version=2.2.4, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, distribution-scope=public, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, release=1793, vendor=Red Hat, Inc.)
Nov 29 06:26:56 compute-2 sudo[107019]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:56.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:56 compute-2 sudo[107790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsulponwtrffqavcpphjhnnmsorvvgov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397615.9936326-748-14587916809385/AnsiballZ_file.py'
Nov 29 06:26:56 compute-2 sudo[107790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:56 compute-2 python3.9[107792]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.uon86pot recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:56 compute-2 sudo[107790]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:57.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:57 compute-2 sudo[107942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxlqhgfpwnababqociukefwocfwerpfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397617.1592243-783-53371493490685/AnsiballZ_stat.py'
Nov 29 06:26:57 compute-2 sudo[107942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:57 compute-2 python3.9[107944]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:57 compute-2 sudo[107942]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:58 compute-2 sudo[108020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgkjdiodyxdjdhxoaeobufvqdgnnsdde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397617.1592243-783-53371493490685/AnsiballZ_file.py'
Nov 29 06:26:58 compute-2 sudo[108020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:58 compute-2 python3.9[108022]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:58 compute-2 sudo[108020]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:58.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:58 compute-2 ceph-mon[77142]: pgmap v435: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:59 compute-2 sudo[108173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfvaaqofkadbzkjfcxfvnhwtekvkdchi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397618.6620207-823-21829609013979/AnsiballZ_command.py'
Nov 29 06:26:59 compute-2 sudo[108173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:26:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000038s ======
Nov 29 06:26:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:59.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Nov 29 06:26:59 compute-2 python3.9[108175]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:26:59 compute-2 sudo[108173]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:59 compute-2 sudo[108177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:59 compute-2 sudo[108177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:59 compute-2 sudo[108177]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:59 compute-2 sudo[108226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:26:59 compute-2 sudo[108226]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:59 compute-2 sudo[108226]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:59 compute-2 sudo[108251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:59 compute-2 sudo[108251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:59 compute-2 sudo[108251]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:59 compute-2 sudo[108277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:26:59 compute-2 sudo[108277]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:00 compute-2 sudo[108277]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:00 compute-2 sudo[108458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpyboabfodulccppwzxrngrmzboukpoi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397619.5869803-847-67065043284191/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:27:00 compute-2 sudo[108458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:00 compute-2 python3[108460]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:27:00 compute-2 sudo[108458]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000039s ======
Nov 29 06:27:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:00.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000039s
Nov 29 06:27:00 compute-2 sudo[108611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hclipjzfbjshckpcmkweskctktgrqfbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397620.511934-871-23096991296213/AnsiballZ_stat.py'
Nov 29 06:27:00 compute-2 sudo[108611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:00 compute-2 sudo[108614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:27:00 compute-2 sudo[108614]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:00 compute-2 sudo[108614]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:00 compute-2 sudo[108639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:27:00 compute-2 sudo[108639]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:00 compute-2 sudo[108639]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:01 compute-2 python3.9[108613]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:01 compute-2 sudo[108611]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:01.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:01 compute-2 sudo[108739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfoaggujrhzsjalgqizmzadeguwtpmzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397620.511934-871-23096991296213/AnsiballZ_file.py'
Nov 29 06:27:01 compute-2 sudo[108739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:01 compute-2 python3.9[108741]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:01 compute-2 sudo[108739]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:02 compute-2 sudo[108891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhmqjxbnfdvbakstxniotjcmjwdoblqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397621.782433-907-247211821343938/AnsiballZ_stat.py'
Nov 29 06:27:02 compute-2 sudo[108891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:02 compute-2 python3.9[108893]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:02 compute-2 sudo[108891]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:02 compute-2 sudo[108970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohpzyrooiqyaanhbwfyhlzyrctjeysqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397621.782433-907-247211821343938/AnsiballZ_file.py'
Nov 29 06:27:02 compute-2 sudo[108970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:02.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:02 compute-2 python3.9[108972]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:02 compute-2 sudo[108970]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000038s ======
Nov 29 06:27:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:03.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Nov 29 06:27:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:03 compute-2 ceph-mon[77142]: pgmap v436: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:03 compute-2 sudo[109122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwdtcyyuzfkcihcilduikjmpgvsdgtyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397623.2621474-943-95990705208525/AnsiballZ_stat.py'
Nov 29 06:27:03 compute-2 sudo[109122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:03 compute-2 python3.9[109124]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:03 compute-2 sudo[109122]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:04 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:04 compute-2 sudo[109200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsukjpzpmuzralpaxveohuwbzeuqbvaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397623.2621474-943-95990705208525/AnsiballZ_file.py'
Nov 29 06:27:04 compute-2 sudo[109200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:04 compute-2 python3.9[109202]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:04 compute-2 sudo[109200]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:04.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:04 compute-2 sudo[109353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzguwavmfpltzmluqrkcpdwyoafrvmkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397624.6060853-979-138960839759525/AnsiballZ_stat.py'
Nov 29 06:27:04 compute-2 sudo[109353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:05 compute-2 python3.9[109355]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:05.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:05 compute-2 sudo[109353]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:05 compute-2 sudo[109431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjuozijjmzpipcvntuqtnkordhkqbrnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397624.6060853-979-138960839759525/AnsiballZ_file.py'
Nov 29 06:27:05 compute-2 sudo[109431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:05 compute-2 python3.9[109433]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:05 compute-2 sudo[109431]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:06 compute-2 sudo[109584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbhzuwrmgkiqrbtvenlfyysqwatnydwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397626.1952045-1015-246442537812112/AnsiballZ_stat.py'
Nov 29 06:27:06 compute-2 sudo[109584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:06.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:06 compute-2 python3.9[109586]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:06 compute-2 sudo[109584]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:07 compute-2 sudo[109662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bocrntnwywecptetwyhwuondmxspelfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397626.1952045-1015-246442537812112/AnsiballZ_file.py'
Nov 29 06:27:07 compute-2 sudo[109662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:07.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:07 compute-2 python3.9[109664]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:07 compute-2 sudo[109662]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000039s ======
Nov 29 06:27:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:08.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000039s
Nov 29 06:27:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000038s ======
Nov 29 06:27:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:09.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Nov 29 06:27:09 compute-2 sudo[109815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqquxzqocejfmcctnrapsglcslgymsmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397629.2520268-1054-147959001913548/AnsiballZ_command.py'
Nov 29 06:27:09 compute-2 sudo[109815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:09 compute-2 python3.9[109817]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:27:09 compute-2 sudo[109815]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:10 compute-2 sudo[109971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdyxrsvsroellacpfkgthgarnhdjjqcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397630.0845687-1078-159913261029703/AnsiballZ_blockinfile.py'
Nov 29 06:27:10 compute-2 sudo[109971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:10.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:10 compute-2 python3.9[109973]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:10 compute-2 sudo[109971]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:11.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:11 compute-2 sudo[110123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmfrjzdihzmoiggkggxefkihnxuecngp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397631.0755064-1105-172344412325459/AnsiballZ_file.py'
Nov 29 06:27:11 compute-2 sudo[110123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:11 compute-2 python3.9[110125]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:11 compute-2 sudo[110123]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:11 compute-2 sudo[110275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phrzpyydkcdcqnktdfwgscntrmqbcxur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397631.6994507-1105-31744437386727/AnsiballZ_file.py'
Nov 29 06:27:12 compute-2 sudo[110275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:12 compute-2 python3.9[110277]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:12 compute-2 ceph-mon[77142]: pgmap v437: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:12 compute-2 ceph-mon[77142]: pgmap v438: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:12 compute-2 sudo[110275]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:12.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:13 compute-2 sudo[110428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhfvmdkzqmxpozplfnmnvwgadppbxcis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397632.6535962-1150-113379077583982/AnsiballZ_mount.py'
Nov 29 06:27:13 compute-2 sudo[110428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:13.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:13 compute-2 python3.9[110430]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 06:27:13 compute-2 sudo[110428]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:13 compute-2 sudo[110580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eujrkyhcucojxkljxeadkhjrvpdixwxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397633.509376-1150-111585176762645/AnsiballZ_mount.py'
Nov 29 06:27:13 compute-2 sudo[110580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:13 compute-2 python3.9[110582]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 06:27:14 compute-2 sudo[110580]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:14 compute-2 ceph-mon[77142]: pgmap v439: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:27:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:27:14 compute-2 ceph-mon[77142]: pgmap v440: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:14 compute-2 ceph-mon[77142]: pgmap v441: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:14 compute-2 ceph-mon[77142]: pgmap v442: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:14 compute-2 ceph-mon[77142]: pgmap v443: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:14 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:27:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000038s ======
Nov 29 06:27:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:14.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Nov 29 06:27:14 compute-2 sshd-session[102735]: Connection closed by 192.168.122.30 port 39118
Nov 29 06:27:14 compute-2 sshd-session[102732]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:27:14 compute-2 systemd[1]: session-39.scope: Deactivated successfully.
Nov 29 06:27:14 compute-2 systemd[1]: session-39.scope: Consumed 30.571s CPU time.
Nov 29 06:27:14 compute-2 systemd-logind[784]: Session 39 logged out. Waiting for processes to exit.
Nov 29 06:27:14 compute-2 systemd-logind[784]: Removed session 39.
Nov 29 06:27:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000038s ======
Nov 29 06:27:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:15.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Nov 29 06:27:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:16.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:17 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:17.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:18.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:27:18 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:27:18 compute-2 ceph-mon[77142]: pgmap v444: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:19.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:19 compute-2 ceph-mon[77142]: pgmap v445: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:19 compute-2 ceph-mon[77142]: pgmap v446: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:20.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:21 compute-2 sshd-session[110612]: Accepted publickey for zuul from 192.168.122.30 port 55860 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:27:21 compute-2 systemd-logind[784]: New session 40 of user zuul.
Nov 29 06:27:21 compute-2 systemd[1]: Started Session 40 of User zuul.
Nov 29 06:27:21 compute-2 sshd-session[110612]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:27:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:21.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:21 compute-2 ceph-mon[77142]: pgmap v447: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:21 compute-2 sudo[110765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixsnynbtzseubwnpxssgxldzlggszaba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397641.1273723-26-248525832284991/AnsiballZ_tempfile.py'
Nov 29 06:27:21 compute-2 sudo[110765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:21 compute-2 python3.9[110767]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 06:27:21 compute-2 sudo[110765]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:22 compute-2 sudo[110792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:27:22 compute-2 sudo[110792]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:22 compute-2 sudo[110792]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:22 compute-2 sudo[110817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:27:22 compute-2 sudo[110817]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:22 compute-2 sudo[110817]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:22 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:22 compute-2 sudo[110894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:27:22 compute-2 sudo[110894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:22 compute-2 sudo[110894]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:22 compute-2 sudo[110919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:27:22 compute-2 sudo[110919]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:22 compute-2 sudo[110919]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:22 compute-2 sudo[111018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsjjsfhbwsnleeprbhgdprisxwjuqiqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397642.1465192-61-137730535597024/AnsiballZ_stat.py'
Nov 29 06:27:22 compute-2 sudo[111018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:22.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:22 compute-2 python3.9[111020]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:27:22 compute-2 sudo[111018]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:23.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:23 compute-2 sudo[111172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyukbhcrchtxrjhyunisynqvaapjtpgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397643.0522044-85-9353337580042/AnsiballZ_slurp.py'
Nov 29 06:27:23 compute-2 sudo[111172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:23 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 06:27:23 compute-2 python3.9[111174]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Nov 29 06:27:23 compute-2 sudo[111172]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:24 compute-2 sudo[111326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnzdetywxynbypnvlqcproryfgxxlhft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397643.9400768-110-76035653769870/AnsiballZ_stat.py'
Nov 29 06:27:24 compute-2 sudo[111326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:24 compute-2 ceph-mon[77142]: pgmap v448: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:24 compute-2 python3.9[111328]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.bamr9k_4 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:24 compute-2 sudo[111326]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:24.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:24 compute-2 sudo[111452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqylxcjpkediomozbexyrxgsmyzdcdva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397643.9400768-110-76035653769870/AnsiballZ_copy.py'
Nov 29 06:27:24 compute-2 sudo[111452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:25 compute-2 python3.9[111454]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.bamr9k_4 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397643.9400768-110-76035653769870/.source.bamr9k_4 _original_basename=.qaqv7d3h follow=False checksum=b291f010aefff8b88f41011b780271a83fd1182f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:25.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:25 compute-2 sudo[111452]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:26 compute-2 sudo[111604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwhycggxqhjvoleefpfxdsxbxmavtfoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397645.468682-155-48920977428344/AnsiballZ_setup.py'
Nov 29 06:27:26 compute-2 sudo[111604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:26 compute-2 python3.9[111606]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:27:26 compute-2 sudo[111604]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:26.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:27.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:27 compute-2 sudo[111757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfyaqwkkvaarxgkwcjxqrytgiyiddipp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397646.8785062-179-81775256939643/AnsiballZ_blockinfile.py'
Nov 29 06:27:27 compute-2 sudo[111757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:27 compute-2 python3.9[111759]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2GXKCQiCwQEMihcSwDVeJtG2CpTemmA6MTbtOkxbB3OAV5PK8v8imPvDGMDurfGFQG0RzWyv9szlMJXdgIkwejIfy/AY7p6nemHOpu6DdAx0EA/jg1YcOIeeEhyMw1/oFzjYClGMohaI1oTKHtR29UXWphTAroOkf26Exvco6hh2ApRTXV9ObzSoOyCC7+OZcOWgYzdoCfu/0FDGkH2ksKLQS7d4AAh/XZ/njXhK57U7ptxHCReUPECGRv7KB4f8TelZDAIeUyp7ngd/9ivUDO1zue1Qr9ECzTzAFqippGXFmYl3+oSid03CY7bqnxav4xWt7UukbaO57goyIPfkklPdC1kA7kZqa9bqeDU1WgDkqnLu8hluArB0Y0Jz+hDfx9pTbAL6MklraoLaGrnrgcibAollAN+7WGqdWxUotENYaljO7P1Z18MlNllWFzk4Le5jMLNL8qArSlzM+ufOThnLdGEuYZhH1x969AisGQ4MQWn0P0lZFu6fE5VSNA/k=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDdPWx5WoFJTxz6PiFZL5f3XrtE682RjGFiIpoe0LXZO
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQlZMweHfLYiJFtm1r2tQze/oNx6KzgaXkK+Kof7POk0cFMLbTsXU8qgbQMh4o5LVO0Hbas4mAqxRkGcFCg2Po=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCX0dhB1m0xL0qEi5jnTQLLB4bvueVV5foNrqU/OkfV/4gRyp7uP2q21lWq5Dtl2GLk51pS6oD41RI41Y5g7OSRs8b1Z66d6X1QgX0Qns6pv7FwmNSQ25+2VGV6lppnaN5e+JHiwTmzpf82hl/MiiJrHo7B63mllKyl9SZJxUhP9RR4czS3QNYQsZyP7sZeCWothTZ2Q/GK4BWBEtj2+ifeOpa342IivopCH05YVQOx9bpsdFHMYaalMDCwvr2lfVns8aTcpJ3z9uE8wLdKWTyiinT7nuLX6RuPwhXB2proBRH1wrGSIUgcVcizkWn8QizD8LlsGFcHIQJkmq+sJz6r7cCZLIfS6hdAzI+hYbJie6n/agwfxe4r+mbXsmmC6ALKKk7CEnaiNnDg0fgTaUfBPwSfu+JmVrjdSO+S8f/CMbtYeO6QknOxhLV9oK6knszv7nLlSYXTzXanHkN4Y0fW3dsSvoE+qDR0YijbbT8slqMd6z95wWVDFUmTcN8Nzk8=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILci1PI4hoB56+xxS5gSMKceuJ/dv6t7etpmtENwoSFr
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJIaOLr2ntjSUcigXC7a0sFoonsuh0ChCx2a1R6G8EDmJ8/ZB8NEiJE6KAQJDNU5XsXjuaC44eJhOUMRK9r98xA=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUVpPatup3d17omeiTdJaYR8jCcDbraJSPBxWy49Wxst4G+6/lD41HVIKmjgCgIbbmYSFBPQmoXt4gFXP4FRKna6AbQWi0kwF3/T2biQ2qCid0HVDSS8YRVlyrpdVc1/bIg6YNLkGnhzOMp0S1443+cg5PqutAbrAT1LOg6lSBu+K9gIqJ4un3l2guSweoyba5UhMyjrq4Pffx1QCuBggtYSjmA9Q1r5VVNc2J7AbP0QuzOe6J6DhpdGJsfmHDVXZb/4b/aPUdCTKkLseyUtcqElWVhhnGnpYSJdN81ejalSktGHE4JRHih19wwTokiKvoczUgijBzOfl+kt2ELcpDgzpzY0M9yd0Zz7wrK4rLM6hi8x3LYZXZv8N7KnawUcJ2jfzilx1BVLdNzgwDNB7ZlP4O9Vs3fKnBufCUFPNcRyWl6ooczepbgxqgSbr/Ham2O4/qzvJmzLtu0KxBkaFALRWnyM39nYVE/jrMKJ5ihtVDxIY9FGma/Jifg15gqI0=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN19pK3a7AH/OiwlqJTVWP/qzU/QzkC16s4D1xY1Vn6J
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLsXsjJNPVMX1YVTe2oBmcZpUSiv3HOeuICgZtQun4hTopMXH9dE1jQeUruGwqZ+NsKW6X2bLZZJ0/tcn2owL8Q=
                                              create=True mode=0644 path=/tmp/ansible.bamr9k_4 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:27 compute-2 sudo[111757]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:28 compute-2 sudo[111909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwjkmahkqjovhtaurxvpeoaylqupdhoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397647.7552948-204-62601038950265/AnsiballZ_command.py'
Nov 29 06:27:28 compute-2 sudo[111909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:28 compute-2 python3.9[111911]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.bamr9k_4' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:27:28 compute-2 sudo[111909]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:28 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:28.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:28 compute-2 ceph-mon[77142]: pgmap v449: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:29 compute-2 sudo[112064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiraqoiuckjeiumsebtbocswyddlzcgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397648.6699066-228-40730723831398/AnsiballZ_file.py'
Nov 29 06:27:29 compute-2 sudo[112064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:29.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:29 compute-2 python3.9[112066]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.bamr9k_4 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:29 compute-2 sudo[112064]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:30 compute-2 sshd-session[110615]: Connection closed by 192.168.122.30 port 55860
Nov 29 06:27:30 compute-2 sshd-session[110612]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:27:30 compute-2 systemd[1]: session-40.scope: Deactivated successfully.
Nov 29 06:27:30 compute-2 systemd[1]: session-40.scope: Consumed 4.821s CPU time.
Nov 29 06:27:30 compute-2 systemd-logind[784]: Session 40 logged out. Waiting for processes to exit.
Nov 29 06:27:30 compute-2 systemd-logind[784]: Removed session 40.
Nov 29 06:27:30 compute-2 ceph-mon[77142]: pgmap v450: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:30 compute-2 ceph-mon[77142]: pgmap v451: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:30.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:31.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:31 compute-2 ceph-mon[77142]: pgmap v452: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:32.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:33.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:33 compute-2 ceph-mon[77142]: pgmap v453: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:33 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:34.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:34 compute-2 sshd-session[112094]: Accepted publickey for zuul from 192.168.122.30 port 49070 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:27:34 compute-2 systemd-logind[784]: New session 41 of user zuul.
Nov 29 06:27:34 compute-2 systemd[1]: Started Session 41 of User zuul.
Nov 29 06:27:35 compute-2 sshd-session[112094]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:27:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:35.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:35 compute-2 ceph-mon[77142]: pgmap v454: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:36 compute-2 python3.9[112247]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:27:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:36.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:37 compute-2 sudo[112402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzfwlcbbnjgfnaaeqtxnpopuecmbigks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397656.4824734-63-137206867824419/AnsiballZ_systemd.py'
Nov 29 06:27:37 compute-2 sudo[112402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:37.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:37 compute-2 python3.9[112404]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 06:27:37 compute-2 sudo[112402]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:38 compute-2 sudo[112556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbywbioroltxbexbrmlugnepzokkwlev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397657.8131745-88-179329273010097/AnsiballZ_systemd.py'
Nov 29 06:27:38 compute-2 sudo[112556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:38 compute-2 python3.9[112558]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:27:38 compute-2 sudo[112556]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:38 compute-2 ceph-mon[77142]: pgmap v455: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:38 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:38.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:39 compute-2 sudo[112710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-movldsircgreydaqaqsllgdidikncgez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397658.7568955-114-266307624747188/AnsiballZ_command.py'
Nov 29 06:27:39 compute-2 sudo[112710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:39.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:40 compute-2 python3.9[112712]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:27:40 compute-2 sudo[112710]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:40.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:41 compute-2 ceph-mon[77142]: pgmap v456: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:41 compute-2 sudo[112864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwcjpptrrjobnzcsdvysbryxzcamlzhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397660.7448847-139-183544647453790/AnsiballZ_stat.py'
Nov 29 06:27:41 compute-2 sudo[112864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:41.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:41 compute-2 python3.9[112866]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:27:41 compute-2 sudo[112864]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:42 compute-2 sudo[112968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:27:42 compute-2 sudo[112968]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:42 compute-2 sudo[112968]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:42 compute-2 sudo[113053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnstvtzucqsjguhksbgjjbutftvswmkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397661.7496917-165-252282594533576/AnsiballZ_file.py'
Nov 29 06:27:42 compute-2 sudo[113053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:42 compute-2 sudo[113031]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:27:42 compute-2 sudo[113031]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:42 compute-2 sudo[113031]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:42 compute-2 python3.9[113066]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:42 compute-2 sudo[113053]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:42.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:42 compute-2 sshd-session[112097]: Connection closed by 192.168.122.30 port 49070
Nov 29 06:27:42 compute-2 sshd-session[112094]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:27:42 compute-2 systemd[1]: session-41.scope: Deactivated successfully.
Nov 29 06:27:42 compute-2 systemd[1]: session-41.scope: Consumed 3.983s CPU time.
Nov 29 06:27:42 compute-2 systemd-logind[784]: Session 41 logged out. Waiting for processes to exit.
Nov 29 06:27:42 compute-2 systemd-logind[784]: Removed session 41.
Nov 29 06:27:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:43.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:43 compute-2 ceph-mon[77142]: pgmap v457: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:43 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:44.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:45.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:46.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:47.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:47 compute-2 ceph-mon[77142]: pgmap v458: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:48 compute-2 sshd-session[113096]: Accepted publickey for zuul from 192.168.122.30 port 35070 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:27:48 compute-2 systemd-logind[784]: New session 42 of user zuul.
Nov 29 06:27:48 compute-2 systemd[1]: Started Session 42 of User zuul.
Nov 29 06:27:48 compute-2 sshd-session[113096]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:27:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:48.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:49 compute-2 python3.9[113250]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:27:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:49.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:49 compute-2 sudo[113404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anzytjsohueeefasktquzkntdldyhusa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397669.7045922-69-23712406295467/AnsiballZ_setup.py'
Nov 29 06:27:49 compute-2 sudo[113404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:50 compute-2 python3.9[113406]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:27:50 compute-2 sudo[113404]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:50.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:50 compute-2 sudo[113489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycisiweohjqjakhiisrztsyqdkfxlwhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397669.7045922-69-23712406295467/AnsiballZ_dnf.py'
Nov 29 06:27:50 compute-2 sudo[113489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:51.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:27:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:52.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:27:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:53.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:54.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:55.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:55 compute-2 python3.9[113491]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:27:56 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:27:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:56.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:57.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:58.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:27:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:59.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:28:00 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:28:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:00.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:01.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:01 compute-2 sudo[113489]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:01 compute-2 anacron[30873]: Job `cron.daily' started
Nov 29 06:28:01 compute-2 anacron[30873]: Job `cron.daily' terminated
Nov 29 06:28:01 compute-2 ceph-mon[77142]: pgmap v459: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:01 compute-2 ceph-mon[77142]: pgmap v460: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:02 compute-2 sudo[113651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:02 compute-2 sudo[113651]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:02 compute-2 sudo[113651]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:02 compute-2 python3.9[113650]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:28:02 compute-2 sudo[113677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:02 compute-2 sudo[113677]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:02 compute-2 sudo[113677]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:02.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).paxos(paxos updating c 252..820) lease_timeout -- calling new election
Nov 29 06:28:03 compute-2 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 06:28:03 compute-2 ceph-mon[77142]: paxos.1).electionLogic(18) init, last seen epoch 18
Nov 29 06:28:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:03.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:03 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:28:03 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:28:03 compute-2 python3.9[113852]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:28:04 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy MDS connection to Monitors appears to be laggy; 15.7604s since last acked beacon
Nov 29 06:28:04 compute-2 ceph-mds[83861]: mds.0.4 skipping upkeep work because connection to Monitors appears laggy
Nov 29 06:28:04 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:04 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:28:04 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy  MDS is no longer laggy
Nov 29 06:28:04 compute-2 python3.9[114003]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:28:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:04.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:28:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:05.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:28:05 compute-2 python3.9[114153]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:28:06 compute-2 sshd-session[113099]: Connection closed by 192.168.122.30 port 35070
Nov 29 06:28:06 compute-2 sshd-session[113096]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:28:06 compute-2 systemd[1]: session-42.scope: Deactivated successfully.
Nov 29 06:28:06 compute-2 systemd[1]: session-42.scope: Consumed 6.193s CPU time.
Nov 29 06:28:06 compute-2 systemd-logind[784]: Session 42 logged out. Waiting for processes to exit.
Nov 29 06:28:06 compute-2 systemd-logind[784]: Removed session 42.
Nov 29 06:28:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:06.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:07.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:08 compute-2 ceph-mon[77142]: pgmap v467: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:08 compute-2 ceph-mon[77142]: pgmap v468: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:08 compute-2 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 06:28:08 compute-2 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 06:28:08 compute-2 ceph-mon[77142]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 06:28:08 compute-2 ceph-mon[77142]: monmap e3: 3 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-1=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Nov 29 06:28:08 compute-2 ceph-mon[77142]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:28:08 compute-2 ceph-mon[77142]: osdmap e139: 3 total, 3 up, 3 in
Nov 29 06:28:08 compute-2 ceph-mon[77142]: mgrmap e10: compute-0.vxabpq(active, since 11m), standbys: compute-2.ngsyhe, compute-1.gaxpay
Nov 29 06:28:08 compute-2 ceph-mon[77142]: overall HEALTH_OK
Nov 29 06:28:08 compute-2 ceph-mon[77142]: pgmap v469: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:08.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:09.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:09 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:09 compute-2 ceph-mon[77142]: pgmap v470: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:09 compute-2 ceph-mon[77142]: pgmap v471: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:28:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:10.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:28:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:11.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:11 compute-2 sshd-session[114181]: Accepted publickey for zuul from 192.168.122.30 port 47530 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:28:11 compute-2 systemd-logind[784]: New session 43 of user zuul.
Nov 29 06:28:11 compute-2 systemd[1]: Started Session 43 of User zuul.
Nov 29 06:28:11 compute-2 sshd-session[114181]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:28:12 compute-2 python3.9[114335]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:28:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:12.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:12 compute-2 ceph-mon[77142]: pgmap v472: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:13.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:14 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:14 compute-2 sudo[114489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdftjrzlaeolissdqjcedrylenaajmog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397693.8847847-116-31441618049704/AnsiballZ_file.py'
Nov 29 06:28:14 compute-2 sudo[114489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:14 compute-2 python3.9[114492]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:14 compute-2 sudo[114489]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:14 compute-2 ceph-mon[77142]: pgmap v473: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:28:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:14.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:28:14 compute-2 sudo[114642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifernmdgdemnhxmybpuomerofwvoksal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397694.735363-116-133379998140831/AnsiballZ_file.py'
Nov 29 06:28:14 compute-2 sudo[114642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:15 compute-2 python3.9[114644]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:15 compute-2 sudo[114642]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:15.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:15 compute-2 sudo[114794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqyqywgogemimieoeslofpcpanrtuxdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397695.4294388-160-227687559416352/AnsiballZ_stat.py'
Nov 29 06:28:15 compute-2 sudo[114794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:16 compute-2 python3.9[114796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:16 compute-2 sudo[114794]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:16 compute-2 ceph-mon[77142]: pgmap v474: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:16 compute-2 sudo[114918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeajhfjfppwzwrezuutklbfjjqkmrvzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397695.4294388-160-227687559416352/AnsiballZ_copy.py'
Nov 29 06:28:16 compute-2 sudo[114918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:28:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:16.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:28:16 compute-2 python3.9[114920]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397695.4294388-160-227687559416352/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=14b9bbfa9929911e1123ed6fe048b8e915417748 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:16 compute-2 sudo[114918]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:17 compute-2 sudo[115070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeklxwsyiymxknkpxdjbiwcewllwxxii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397696.9737144-160-64058702470122/AnsiballZ_stat.py'
Nov 29 06:28:17 compute-2 sudo[115070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:17.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:17 compute-2 python3.9[115072]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:17 compute-2 sudo[115070]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:17 compute-2 sudo[115193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiizzouyoaojqxkmbrkvhxjxvzyibfkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397696.9737144-160-64058702470122/AnsiballZ_copy.py'
Nov 29 06:28:17 compute-2 sudo[115193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:18 compute-2 python3.9[115195]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397696.9737144-160-64058702470122/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=03c2952c2692ca442730881904078ac3e566f340 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:18 compute-2 sudo[115193]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:18 compute-2 sudo[115346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvduqftbhcadxiyevncmobdlrddxbzsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397698.1790023-160-13165336933269/AnsiballZ_stat.py'
Nov 29 06:28:18 compute-2 sudo[115346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:18 compute-2 python3.9[115348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:18 compute-2 sudo[115346]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:18.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:18 compute-2 sudo[115469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gushipkzwdxtcgbhzwxvuobmbnxsuvqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397698.1790023-160-13165336933269/AnsiballZ_copy.py'
Nov 29 06:28:18 compute-2 sudo[115469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:19 compute-2 python3.9[115471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397698.1790023-160-13165336933269/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=be9a231ca8cb9d5c8a85bd82f4d8528bcb487e51 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:19 compute-2 sudo[115469]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:19 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:19.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:19 compute-2 sudo[115621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdqicfkiucylziprbsfyamuvmqegetlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397699.3923001-280-30332918776130/AnsiballZ_file.py'
Nov 29 06:28:19 compute-2 sudo[115621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:19 compute-2 python3.9[115623]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:19 compute-2 sudo[115621]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:20 compute-2 sudo[115773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcdzqchcswnpmabezblzxkdlofvcwsci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397699.9709058-280-6636966802027/AnsiballZ_file.py'
Nov 29 06:28:20 compute-2 sudo[115773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:20 compute-2 python3.9[115775]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:20 compute-2 sudo[115773]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:28:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:20.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:28:21 compute-2 sudo[115926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywovthgpfpfbcjfogfslfxrnuqjsrutm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397700.6391497-322-189162828707543/AnsiballZ_stat.py'
Nov 29 06:28:21 compute-2 sudo[115926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:21.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:21 compute-2 python3.9[115928]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:21 compute-2 sudo[115926]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:21 compute-2 sudo[116049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfmdzwqmpxwuexsbaayviifmvmjhxuof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397700.6391497-322-189162828707543/AnsiballZ_copy.py'
Nov 29 06:28:21 compute-2 sudo[116049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:21 compute-2 python3.9[116051]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397700.6391497-322-189162828707543/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=d83e9ef310607793aac5272a5dd3ed54e63fe338 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:21 compute-2 sudo[116049]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-2 sudo[116201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqstmpmxaevfyakzodasbglnyiagpciw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397702.0420837-322-120017904376708/AnsiballZ_stat.py'
Nov 29 06:28:22 compute-2 sudo[116201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:22 compute-2 sudo[116205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:22 compute-2 sudo[116205]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:22 compute-2 sudo[116205]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-2 python3.9[116203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:22 compute-2 sudo[116228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:22 compute-2 sudo[116228]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:22 compute-2 sudo[116228]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-2 sudo[116201]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-2 sudo[116253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:28:22 compute-2 sudo[116253]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:22 compute-2 sudo[116253]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-2 sudo[116278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:22 compute-2 sudo[116278]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:22 compute-2 sudo[116278]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-2 sudo[116298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:22 compute-2 sudo[116298]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:22 compute-2 sudo[116298]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-2 sudo[116353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:28:22 compute-2 sudo[116353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:28:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:22.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:28:22 compute-2 sudo[116488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxedmzmramkcnqblgxbxearhgmobachl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397702.0420837-322-120017904376708/AnsiballZ_copy.py'
Nov 29 06:28:22 compute-2 sudo[116488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:23 compute-2 python3.9[116500]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397702.0420837-322-120017904376708/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=446989bd92736b57ebc923ce429d8effafd00e68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:23 compute-2 sudo[116488]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:23 compute-2 podman[116550]: 2025-11-29 06:28:23.097658266 +0000 UTC m=+0.059937363 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:28:23 compute-2 podman[116550]: 2025-11-29 06:28:23.198479232 +0000 UTC m=+0.160758319 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:28:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:23.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:23 compute-2 ceph-mon[77142]: pgmap v475: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:23 compute-2 sudo[116773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyitedfjhqcanljdabhtkyceqswddmxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397703.154555-322-33864050510225/AnsiballZ_stat.py'
Nov 29 06:28:23 compute-2 sudo[116773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:23 compute-2 python3.9[116782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:23 compute-2 sudo[116773]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:23 compute-2 sudo[116982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivpviqtmygfcawzjsvxufutimjqshnwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397703.154555-322-33864050510225/AnsiballZ_copy.py'
Nov 29 06:28:23 compute-2 sudo[116982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:24 compute-2 podman[116871]: 2025-11-29 06:28:24.154354169 +0000 UTC m=+0.429044514 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:28:24 compute-2 podman[116871]: 2025-11-29 06:28:24.167275031 +0000 UTC m=+0.441965376 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:28:24 compute-2 python3.9[116984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397703.154555-322-33864050510225/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=5d5a903db1eca232a57ca76aa1a372ced69c51b8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:24 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:24 compute-2 sudo[116982]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:24 compute-2 podman[117037]: 2025-11-29 06:28:24.538998362 +0000 UTC m=+0.201944369 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, vendor=Red Hat, Inc., name=keepalived, release=1793, io.openshift.tags=Ceph keepalived, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, version=2.2.4, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.openshift.expose-services=)
Nov 29 06:28:24 compute-2 podman[117106]: 2025-11-29 06:28:24.635775738 +0000 UTC m=+0.074162731 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, description=keepalived for Ceph, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, name=keepalived, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-type=git)
Nov 29 06:28:24 compute-2 podman[117037]: 2025-11-29 06:28:24.692555174 +0000 UTC m=+0.355501181 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., architecture=x86_64, version=2.2.4, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, name=keepalived, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 06:28:24 compute-2 sudo[116353]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:24.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:25 compute-2 sudo[117221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsaisqggcuxrvptkbhqnazopvxpkyelm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397704.5499349-430-239875019345631/AnsiballZ_file.py'
Nov 29 06:28:25 compute-2 sudo[117221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:25 compute-2 ceph-mon[77142]: pgmap v476: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:25 compute-2 ceph-mon[77142]: pgmap v477: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:25 compute-2 ceph-mon[77142]: pgmap v478: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:25 compute-2 python3.9[117223]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:25 compute-2 sudo[117221]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:25.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:25 compute-2 sudo[117248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:25 compute-2 sudo[117248]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:25 compute-2 sudo[117248]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:25 compute-2 sudo[117296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:28:25 compute-2 sudo[117296]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:25 compute-2 sudo[117296]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:25 compute-2 sudo[117348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:25 compute-2 sudo[117348]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:25 compute-2 sudo[117348]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:25 compute-2 sudo[117392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:28:25 compute-2 sudo[117392]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:25 compute-2 sudo[117473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdrbmplmokmunoznolvvfjvmufqczshr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397705.422981-430-158552919190810/AnsiballZ_file.py'
Nov 29 06:28:25 compute-2 sudo[117473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:25 compute-2 python3.9[117475]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:25 compute-2 sudo[117473]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:25 compute-2 sudo[117392]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:26 compute-2 ceph-mon[77142]: pgmap v479: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:28:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:28:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:28:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:28:26 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:28:26 compute-2 sudo[117657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zirybuhhprnnawhnksirladcthtbrozy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397706.120503-475-106873351317592/AnsiballZ_stat.py'
Nov 29 06:28:26 compute-2 sudo[117657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:26 compute-2 python3.9[117659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:26 compute-2 sudo[117657]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:26.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:26 compute-2 sudo[117780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvwrxfcmwbcbicfzxieqgtivqrslwiwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397706.120503-475-106873351317592/AnsiballZ_copy.py'
Nov 29 06:28:26 compute-2 sudo[117780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:27 compute-2 python3.9[117782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397706.120503-475-106873351317592/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=4ccf2634f20abca04ee2090faa470941e7667ac5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:27 compute-2 sudo[117780]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:27.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:27 compute-2 sudo[117932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvtlikxwosoymijqdzriaqzcckdshrql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397707.288686-475-84753795137553/AnsiballZ_stat.py'
Nov 29 06:28:27 compute-2 sudo[117932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:27 compute-2 python3.9[117934]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:27 compute-2 sudo[117932]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:28 compute-2 sudo[118055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtiykumbdybyqwtdcnfsxfnqbjrepgxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397707.288686-475-84753795137553/AnsiballZ_copy.py'
Nov 29 06:28:28 compute-2 sudo[118055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:28 compute-2 python3.9[118057]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397707.288686-475-84753795137553/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=446989bd92736b57ebc923ce429d8effafd00e68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:28 compute-2 sudo[118055]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:28.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:28 compute-2 sudo[118208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxunetnalbvcvsbmcxrilwiinbwhdcng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397708.4817088-475-251589301346846/AnsiballZ_stat.py'
Nov 29 06:28:28 compute-2 sudo[118208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:28 compute-2 python3.9[118210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:28 compute-2 sudo[118208]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:29 compute-2 ceph-mon[77142]: pgmap v480: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:29.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:29 compute-2 sudo[118331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgrunbawoyjhzfpqfjwkhgqilarsvgpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397708.4817088-475-251589301346846/AnsiballZ_copy.py'
Nov 29 06:28:29 compute-2 sudo[118331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:29 compute-2 python3.9[118333]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397708.4817088-475-251589301346846/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=e97a0024800c75a2251bda4519fe7a3e8494189f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:29 compute-2 sudo[118331]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:30 compute-2 sudo[118484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztepsoetxsuqbbxdlqgsykahedswvccq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397710.4444807-623-261011543133282/AnsiballZ_file.py'
Nov 29 06:28:30 compute-2 sudo[118484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:30.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:30 compute-2 python3.9[118486]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:30 compute-2 sudo[118484]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:31.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:31 compute-2 sudo[118636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liijnklzvcqkaksoaetsbrbkxcwnyujy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397711.0869498-645-50314244674820/AnsiballZ_stat.py'
Nov 29 06:28:31 compute-2 sudo[118636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:31 compute-2 python3.9[118638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:31 compute-2 sudo[118636]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:31 compute-2 ceph-mon[77142]: pgmap v481: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:31 compute-2 sudo[118759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krohhewjztdktntxuuldvlijbilbkhsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397711.0869498-645-50314244674820/AnsiballZ_copy.py'
Nov 29 06:28:31 compute-2 sudo[118759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:32 compute-2 python3.9[118761]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397711.0869498-645-50314244674820/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:32 compute-2 sudo[118759]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:28:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:32.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:28:32 compute-2 sudo[118912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvsqechkxyphesrqbekiyayvfkgzwkol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397712.4833586-696-30931208139522/AnsiballZ_file.py'
Nov 29 06:28:32 compute-2 sudo[118912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:32 compute-2 python3.9[118914]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:32 compute-2 sudo[118912]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:33.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:33 compute-2 sudo[119064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjzlntzfhiexkcwvxrgkwldaniwjybsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397713.1532283-729-236043614670945/AnsiballZ_stat.py'
Nov 29 06:28:33 compute-2 sudo[119064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:33 compute-2 python3.9[119066]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:33 compute-2 sudo[119064]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:34 compute-2 sudo[119187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dovbombddrwrbdzcdgqeuxhxyvynucrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397713.1532283-729-236043614670945/AnsiballZ_copy.py'
Nov 29 06:28:34 compute-2 sudo[119187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:34 compute-2 ceph-mon[77142]: pgmap v482: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:34 compute-2 python3.9[119189]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397713.1532283-729-236043614670945/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:34 compute-2 sudo[119187]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:34 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:34.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:34 compute-2 sudo[119340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asgtqjwbazmroqnevuesidirkcckpzet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397714.532634-776-71700385376344/AnsiballZ_file.py'
Nov 29 06:28:34 compute-2 sudo[119340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:35 compute-2 python3.9[119342]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:35 compute-2 sudo[119340]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:35.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:35 compute-2 sudo[119492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqeshtktbazuigqfzphysgkxycumeraa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397715.219514-801-7912416078450/AnsiballZ_stat.py'
Nov 29 06:28:35 compute-2 sudo[119492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:35 compute-2 python3.9[119494]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:35 compute-2 sudo[119492]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:36 compute-2 sudo[119615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvyzrzpktvxaykqnsvvuzggddcjkuxov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397715.219514-801-7912416078450/AnsiballZ_copy.py'
Nov 29 06:28:36 compute-2 sudo[119615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:36 compute-2 ceph-mon[77142]: pgmap v483: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:36 compute-2 ceph-mon[77142]: pgmap v484: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:36 compute-2 python3.9[119617]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397715.219514-801-7912416078450/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:36 compute-2 sudo[119615]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:36 compute-2 sudo[119768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjuwmhtknbaxpsxamudfwglclymezpnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397716.4828517-843-30034776041049/AnsiballZ_file.py'
Nov 29 06:28:36 compute-2 sudo[119768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:36.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:36 compute-2 python3.9[119770]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:36 compute-2 sudo[119768]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:37.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:37 compute-2 sudo[119920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehrpvzeuhzzhedcgbkonuzisccsdqgfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397717.0958455-868-180624767154835/AnsiballZ_stat.py'
Nov 29 06:28:37 compute-2 sudo[119920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:37 compute-2 python3.9[119922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:37 compute-2 sudo[119920]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:38 compute-2 sudo[120043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgdrhayvyxpyvciiqazrnsmgutdzketn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397717.0958455-868-180624767154835/AnsiballZ_copy.py'
Nov 29 06:28:38 compute-2 sudo[120043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:38 compute-2 python3.9[120045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397717.0958455-868-180624767154835/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:38 compute-2 sudo[120043]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:38 compute-2 sudo[120196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqrhoypzuazwdtdctzgyyhnqsutuxmfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397718.476113-915-280719157974104/AnsiballZ_file.py'
Nov 29 06:28:38 compute-2 sudo[120196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:38.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:39 compute-2 python3.9[120198]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:39 compute-2 sudo[120196]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:39 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:39.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:39 compute-2 ceph-mon[77142]: pgmap v485: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:39 compute-2 sudo[120348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tclnqijytevatqbigjsmomdjwvoluydc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397719.410827-938-74257077186553/AnsiballZ_stat.py'
Nov 29 06:28:39 compute-2 sudo[120348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:39 compute-2 python3.9[120350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:39 compute-2 sudo[120348]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:40 compute-2 sshd-session[120351]: Invalid user linux from 92.118.39.92 port 45280
Nov 29 06:28:40 compute-2 sudo[120473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyzgvzeihyigkxumkqpgudigevjvxmhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397719.410827-938-74257077186553/AnsiballZ_copy.py'
Nov 29 06:28:40 compute-2 sudo[120473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:40 compute-2 sshd-session[120351]: Connection closed by invalid user linux 92.118.39.92 port 45280 [preauth]
Nov 29 06:28:40 compute-2 python3.9[120475]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397719.410827-938-74257077186553/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:40 compute-2 sudo[120473]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:40.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:40 compute-2 ceph-mon[77142]: pgmap v486: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:41 compute-2 sudo[120626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cippgiekudotbjoezqhgewuxyvlcklss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397720.7288392-985-154693472387825/AnsiballZ_file.py'
Nov 29 06:28:41 compute-2 sudo[120626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:41 compute-2 python3.9[120628]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:41 compute-2 sudo[120626]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:41.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:41 compute-2 sudo[120778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrkrmdwnmblnyexdqimpjivmaklxcipi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397721.3963447-1010-175397569019269/AnsiballZ_stat.py'
Nov 29 06:28:41 compute-2 sudo[120778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:41 compute-2 ceph-mon[77142]: pgmap v487: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:41 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:41 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:41 compute-2 python3.9[120780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:41 compute-2 sudo[120778]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:42 compute-2 sudo[120781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:42 compute-2 sudo[120781]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:42 compute-2 sudo[120781]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:42 compute-2 sudo[120829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:28:42 compute-2 sudo[120829]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:42 compute-2 sudo[120829]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:42 compute-2 sudo[120951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bylyszbuowghvcwqmueygkejkwbtnmgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397721.3963447-1010-175397569019269/AnsiballZ_copy.py'
Nov 29 06:28:42 compute-2 sudo[120951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:42 compute-2 python3.9[120953]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397721.3963447-1010-175397569019269/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:42 compute-2 sudo[120951]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:42 compute-2 sudo[120961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:42 compute-2 sudo[120961]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:42 compute-2 sudo[120961]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:42 compute-2 sudo[121004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:42 compute-2 sudo[121004]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:42 compute-2 sudo[121004]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:28:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:42.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:28:43 compute-2 ceph-mon[77142]: pgmap v488: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:43.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:44 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:44.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.005000136s ======
Nov 29 06:28:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:45.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000136s
Nov 29 06:28:46 compute-2 ceph-mon[77142]: pgmap v489: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:46.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:47 compute-2 sshd-session[114184]: Connection closed by 192.168.122.30 port 47530
Nov 29 06:28:47 compute-2 sshd-session[114181]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:28:47 compute-2 systemd[1]: session-43.scope: Deactivated successfully.
Nov 29 06:28:47 compute-2 systemd[1]: session-43.scope: Consumed 22.510s CPU time.
Nov 29 06:28:47 compute-2 systemd-logind[784]: Session 43 logged out. Waiting for processes to exit.
Nov 29 06:28:47 compute-2 systemd-logind[784]: Removed session 43.
Nov 29 06:28:47 compute-2 ceph-mon[77142]: pgmap v490: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:28:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:47.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:28:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:28:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:48.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:28:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:49.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:50 compute-2 ceph-mon[77142]: pgmap v491: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:50.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:51.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:52 compute-2 sshd-session[121034]: Accepted publickey for zuul from 192.168.122.30 port 33930 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:28:52 compute-2 systemd-logind[784]: New session 44 of user zuul.
Nov 29 06:28:52 compute-2 systemd[1]: Started Session 44 of User zuul.
Nov 29 06:28:52 compute-2 sshd-session[121034]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:28:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:52.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:53 compute-2 ceph-mon[77142]: pgmap v492: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:53.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:53 compute-2 sudo[121187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgeomshkermfpfhpkrmjamjroyxdvsso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397732.8063548-34-75394885629921/AnsiballZ_file.py'
Nov 29 06:28:53 compute-2 sudo[121187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:53 compute-2 python3.9[121189]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:53 compute-2 sudo[121187]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:54 compute-2 ceph-mon[77142]: pgmap v493: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:54 compute-2 sudo[121339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdzsqkiayrmqrnwopiygircwpftywimw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397733.781479-69-25501589853031/AnsiballZ_stat.py'
Nov 29 06:28:54 compute-2 sudo[121339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:54 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:54 compute-2 python3.9[121341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:54 compute-2 sudo[121339]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:54.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:54 compute-2 sudo[121463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlkchnsjwylipdufmqwebprnegxttsum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397733.781479-69-25501589853031/AnsiballZ_copy.py'
Nov 29 06:28:54 compute-2 sudo[121463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:55 compute-2 python3.9[121465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397733.781479-69-25501589853031/.source.conf _original_basename=ceph.conf follow=False checksum=b678e866ce48244e104f356f74865d3398155ff0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:55 compute-2 sudo[121463]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:55.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:55 compute-2 sudo[121615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bllbdaqhypcustwxsfwxwvzebeocbhly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397735.3108172-69-208818788610847/AnsiballZ_stat.py'
Nov 29 06:28:55 compute-2 sudo[121615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:55 compute-2 python3.9[121617]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:55 compute-2 sudo[121615]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:56 compute-2 sudo[121738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcysrhcdhevukovjppdgavrjnbmozoqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397735.3108172-69-208818788610847/AnsiballZ_copy.py'
Nov 29 06:28:56 compute-2 sudo[121738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:56 compute-2 ceph-mon[77142]: pgmap v494: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:56 compute-2 python3.9[121740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397735.3108172-69-208818788610847/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=d5bc1b1c0617b147c8e3e13846b179249a244079 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:56 compute-2 sudo[121738]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.748428) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736748481, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1351, "num_deletes": 253, "total_data_size": 3239363, "memory_usage": 3272648, "flush_reason": "Manual Compaction"}
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736761353, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1349524, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7910, "largest_seqno": 9256, "table_properties": {"data_size": 1344764, "index_size": 2156, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12194, "raw_average_key_size": 20, "raw_value_size": 1334447, "raw_average_value_size": 2246, "num_data_blocks": 99, "num_entries": 594, "num_filter_entries": 594, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397574, "oldest_key_time": 1764397574, "file_creation_time": 1764397736, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 12980 microseconds, and 4308 cpu microseconds.
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.761406) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1349524 bytes OK
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.761433) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.764326) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.764356) EVENT_LOG_v1 {"time_micros": 1764397736764347, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.764382) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 3232965, prev total WAL file size 3232965, number of live WAL files 2.
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.765227) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323534' seq:0, type:0; will stop at (end)
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1317KB)], [15(10002KB)]
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736765267, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 11591842, "oldest_snapshot_seqno": -1}
Nov 29 06:28:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:56.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3885 keys, 9449033 bytes, temperature: kUnknown
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736827915, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9449033, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9417566, "index_size": 20669, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9733, "raw_key_size": 95386, "raw_average_key_size": 24, "raw_value_size": 9341653, "raw_average_value_size": 2404, "num_data_blocks": 911, "num_entries": 3885, "num_filter_entries": 3885, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764397736, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.828215) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9449033 bytes
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.829578) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 184.8 rd, 150.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 9.8 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(15.6) write-amplify(7.0) OK, records in: 4363, records dropped: 478 output_compression: NoCompression
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.829599) EVENT_LOG_v1 {"time_micros": 1764397736829588, "job": 6, "event": "compaction_finished", "compaction_time_micros": 62741, "compaction_time_cpu_micros": 25673, "output_level": 6, "num_output_files": 1, "total_output_size": 9449033, "num_input_records": 4363, "num_output_records": 3885, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736829925, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736831844, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.765156) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.831876) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.831883) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.831885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.831887) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:56 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.831889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:57 compute-2 sshd-session[121037]: Connection closed by 192.168.122.30 port 33930
Nov 29 06:28:57 compute-2 sshd-session[121034]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:28:57 compute-2 systemd[1]: session-44.scope: Deactivated successfully.
Nov 29 06:28:57 compute-2 systemd[1]: session-44.scope: Consumed 2.584s CPU time.
Nov 29 06:28:57 compute-2 systemd-logind[784]: Session 44 logged out. Waiting for processes to exit.
Nov 29 06:28:57 compute-2 systemd-logind[784]: Removed session 44.
Nov 29 06:28:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:57.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:58 compute-2 ceph-mon[77142]: pgmap v495: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:58.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:28:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:28:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:59.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:29:00 compute-2 ceph-mon[77142]: pgmap v496: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:00.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:01.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:02 compute-2 ceph-mon[77142]: pgmap v497: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:02 compute-2 sudo[121769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:02 compute-2 sudo[121769]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:02 compute-2 sudo[121769]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:02 compute-2 sudo[121796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:02 compute-2 sudo[121796]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:02.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:02 compute-2 sudo[121796]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:02 compute-2 sshd-session[121776]: Accepted publickey for zuul from 192.168.122.30 port 49290 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:29:02 compute-2 systemd-logind[784]: New session 45 of user zuul.
Nov 29 06:29:02 compute-2 systemd[1]: Started Session 45 of User zuul.
Nov 29 06:29:02 compute-2 sshd-session[121776]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:29:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:03.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:03 compute-2 python3.9[121972]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:29:04 compute-2 ceph-mon[77142]: pgmap v498: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:04 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:04.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:05 compute-2 sudo[122127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnplsmwwtbqgzojmsgrfxtbipxkdfdmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397744.6924887-70-18494871461171/AnsiballZ_file.py'
Nov 29 06:29:05 compute-2 sudo[122127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:05.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:05 compute-2 python3.9[122129]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:05 compute-2 sudo[122127]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:05 compute-2 sudo[122279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwkkrxbnupnsuijalcyrxidqptwkpjll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397745.702947-70-141234573772997/AnsiballZ_file.py'
Nov 29 06:29:05 compute-2 sudo[122279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:06 compute-2 python3.9[122281]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:06 compute-2 sudo[122279]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:06.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:07 compute-2 python3.9[122432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:29:07 compute-2 ceph-mon[77142]: pgmap v499: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:07.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:07 compute-2 sudo[122582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gapygrurwqkcbqnwavvgbqxiilyboeox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397747.3458369-138-195486482674150/AnsiballZ_seboolean.py'
Nov 29 06:29:07 compute-2 sudo[122582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:08 compute-2 python3.9[122584]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 06:29:08 compute-2 ceph-mon[77142]: pgmap v500: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:08.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:09 compute-2 sudo[122582]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:09.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:09 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:10.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:11.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:12.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:13 compute-2 ceph-mon[77142]: pgmap v501: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:13.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:13 compute-2 sudo[122741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bykyglftfmwejvzmwtmkofurpefphfyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397753.427247-168-53359879802396/AnsiballZ_setup.py'
Nov 29 06:29:13 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 29 06:29:13 compute-2 sudo[122741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:14 compute-2 python3.9[122743]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:29:14 compute-2 sudo[122741]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:14 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:14 compute-2 sudo[122826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzweaqnrwhgtlrxilnigzaayruldekhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397753.427247-168-53359879802396/AnsiballZ_dnf.py'
Nov 29 06:29:14 compute-2 sudo[122826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:14.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:14 compute-2 ceph-mon[77142]: pgmap v502: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:14 compute-2 ceph-mon[77142]: pgmap v503: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:15 compute-2 python3.9[122828]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:29:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:15.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:16 compute-2 sudo[122826]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:16 compute-2 ceph-mon[77142]: pgmap v504: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:16.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:17.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:18 compute-2 sudo[122980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlglqpoqaievfpgvvorzaysoeysjbwir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397757.5217144-204-215377086433476/AnsiballZ_systemd.py'
Nov 29 06:29:18 compute-2 sudo[122980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:18 compute-2 python3.9[122982]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:29:18 compute-2 sudo[122980]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:29:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 1366 writes, 9395 keys, 1366 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 1366 writes, 1366 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1366 writes, 9395 keys, 1366 commit groups, 1.0 writes per commit group, ingest: 19.37 MB, 0.03 MB/s
                                           Interval WAL: 1366 writes, 1366 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    131.8      0.08              0.02         3    0.028       0      0       0.0       0.0
                                             L6      1/0    9.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.7    154.5    139.3      0.13              0.05         2    0.067    8390    736       0.0       0.0
                                            Sum      1/0    9.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     95.2    136.5      0.22              0.07         5    0.044    8390    736       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     96.0    137.6      0.22              0.07         4    0.054    8390    736       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    154.5    139.3      0.13              0.05         2    0.067    8390    736       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    134.6      0.08              0.02         2    0.041       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.011, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.03 GB write, 0.05 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.2 seconds
                                           Interval compaction: 0.03 GB write, 0.05 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.2 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55be896f31f0#2 capacity: 304.00 MB usage: 767.56 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 6.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(32,660.91 KB,0.212308%) FilterBlock(5,31.98 KB,0.0102746%) IndexBlock(5,74.67 KB,0.0239874%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Nov 29 06:29:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:18.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:19.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:19 compute-2 sudo[123136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igsmtlbpypcsdzwazhyhwhjlrmwsqsjo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397758.9257736-228-139377101979212/AnsiballZ_edpm_nftables_snippet.py'
Nov 29 06:29:19 compute-2 sudo[123136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:19 compute-2 python3[123138]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 29 06:29:19 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:19 compute-2 sudo[123136]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:20 compute-2 ceph-mon[77142]: pgmap v505: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:20.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:21 compute-2 sudo[123289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeibupoksbqtuljfqytzzhnresbptksa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397761.0076826-255-203948997477606/AnsiballZ_file.py'
Nov 29 06:29:21 compute-2 sudo[123289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:21.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:21 compute-2 python3.9[123291]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:21 compute-2 sudo[123289]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:22 compute-2 sudo[123441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmydfqfjhxdkqzzzjuomtqoemsqicpkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397761.722697-280-187362346872323/AnsiballZ_stat.py'
Nov 29 06:29:22 compute-2 sudo[123441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:22 compute-2 ceph-mon[77142]: pgmap v506: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:22 compute-2 ceph-mon[77142]: pgmap v507: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:22 compute-2 python3.9[123443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:22 compute-2 sudo[123441]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:22 compute-2 sudo[123520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxovzavcuoarukuesdgiteadkipjqtdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397761.722697-280-187362346872323/AnsiballZ_file.py'
Nov 29 06:29:22 compute-2 sudo[123520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:22 compute-2 python3.9[123522]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:22.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:22 compute-2 sudo[123520]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:22 compute-2 sudo[123523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:22 compute-2 sudo[123523]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:22 compute-2 sudo[123523]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:22 compute-2 sudo[123572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:22 compute-2 sudo[123572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:22 compute-2 sudo[123572]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:23 compute-2 sudo[123722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eivxpbvojboyfisxusukqvxmudzfkmuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397763.059761-315-274160858355330/AnsiballZ_stat.py'
Nov 29 06:29:23 compute-2 sudo[123722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:23.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:23 compute-2 python3.9[123724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:23 compute-2 sudo[123722]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:23 compute-2 sudo[123800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfltmxvexyncvdicypylasuowuzopsox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397763.059761-315-274160858355330/AnsiballZ_file.py'
Nov 29 06:29:23 compute-2 sudo[123800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:23 compute-2 python3.9[123802]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.x3ahtaus recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:23 compute-2 sudo[123800]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:24 compute-2 ceph-mon[77142]: pgmap v508: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:24 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:24.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:24 compute-2 sudo[123953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpxouysfwqjsswnzamvrexfydgxzqgtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397764.576823-351-95938854450437/AnsiballZ_stat.py'
Nov 29 06:29:24 compute-2 sudo[123953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:25 compute-2 python3.9[123955]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:25 compute-2 sudo[123953]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:25.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:25 compute-2 sudo[124031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kktsvgbysxpixcupqaqxeoshqjmpxwwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397764.576823-351-95938854450437/AnsiballZ_file.py'
Nov 29 06:29:25 compute-2 sudo[124031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:25 compute-2 python3.9[124033]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:25 compute-2 sudo[124031]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:26 compute-2 sudo[124183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdgxsveizzxcxqgwszffsvtnfsqxmkah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397765.856739-390-40964597413060/AnsiballZ_command.py'
Nov 29 06:29:26 compute-2 sudo[124183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:26 compute-2 python3.9[124185]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:26 compute-2 sudo[124183]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:26.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:27.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:27 compute-2 sudo[124337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otnzkcvsytyanszdxuwvbeyjpxdixxos ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397767.459928-414-55653160734430/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:29:27 compute-2 sudo[124337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:28 compute-2 python3[124339]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:29:28 compute-2 ceph-mon[77142]: pgmap v509: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:28 compute-2 sudo[124337]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:28 compute-2 sudo[124490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgepnlshtbthshoxeqtvsteicgoggwap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397768.4435725-438-159037064962300/AnsiballZ_stat.py'
Nov 29 06:29:28 compute-2 sudo[124490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:28.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:28 compute-2 python3.9[124492]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:29 compute-2 sudo[124490]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:29.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:29 compute-2 sudo[124615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nagkhixumzixdeghslksdpatudsvwjmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397768.4435725-438-159037064962300/AnsiballZ_copy.py'
Nov 29 06:29:29 compute-2 sudo[124615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:29 compute-2 ceph-mon[77142]: pgmap v510: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:29 compute-2 python3.9[124617]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397768.4435725-438-159037064962300/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:29 compute-2 sudo[124615]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:30 compute-2 sudo[124768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znasklwqvpziszrkwgmzogkymaotwvzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397770.19629-483-20650074064733/AnsiballZ_stat.py'
Nov 29 06:29:30 compute-2 sudo[124768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:30.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:31 compute-2 python3.9[124770]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:31 compute-2 sudo[124768]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:31.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:31 compute-2 sudo[124893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aovwvqmssduxcayrcmpjyfukkxwoafuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397770.19629-483-20650074064733/AnsiballZ_copy.py'
Nov 29 06:29:31 compute-2 sudo[124893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:31 compute-2 python3.9[124895]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397770.19629-483-20650074064733/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:31 compute-2 sudo[124893]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:32 compute-2 sudo[125045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfdhtxeumijrgeokminjmgusoajncmcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397772.0220683-529-179184675933415/AnsiballZ_stat.py'
Nov 29 06:29:32 compute-2 sudo[125045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:32 compute-2 python3.9[125047]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:32 compute-2 sudo[125045]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:32 compute-2 sudo[125171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oahptlhkhkzhpecmekvjhsqzedxrdhob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397772.0220683-529-179184675933415/AnsiballZ_copy.py'
Nov 29 06:29:32 compute-2 sudo[125171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:29:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:32.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:29:33 compute-2 python3.9[125173]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397772.0220683-529-179184675933415/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:33 compute-2 sudo[125171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:33.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:34 compute-2 sudo[125323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmngxptgrigsnoajzjhbnnkhagdcqxjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397773.7340424-574-126815727553777/AnsiballZ_stat.py'
Nov 29 06:29:34 compute-2 sudo[125323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:34 compute-2 python3.9[125325]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:34 compute-2 sudo[125323]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:34 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:34 compute-2 ceph-mon[77142]: pgmap v511: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:34.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:34 compute-2 ceph-mon[77142]: pgmap v512: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:34 compute-2 ceph-mon[77142]: pgmap v513: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:35 compute-2 sudo[125449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxgcgceljdvmxhwadcpwficemcglyhyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397773.7340424-574-126815727553777/AnsiballZ_copy.py'
Nov 29 06:29:35 compute-2 sudo[125449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:35 compute-2 python3.9[125451]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397773.7340424-574-126815727553777/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:35 compute-2 sudo[125449]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:35.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:35 compute-2 sudo[125601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntyxjeaxthmumbnuvacryexgmsetzeum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397775.5305605-619-232983562042340/AnsiballZ_stat.py'
Nov 29 06:29:35 compute-2 sudo[125601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:36 compute-2 python3.9[125603]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:36 compute-2 sudo[125601]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:36 compute-2 sudo[125727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tenbvlnoyakxstfkxffctgrjufvrtmcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397775.5305605-619-232983562042340/AnsiballZ_copy.py'
Nov 29 06:29:36 compute-2 sudo[125727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:36.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:37 compute-2 python3.9[125729]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397775.5305605-619-232983562042340/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:37 compute-2 sudo[125727]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:37.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:37 compute-2 sudo[125879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfxbioebqavuphiovlnsosfscdmesats ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397777.283501-664-169324103345822/AnsiballZ_file.py'
Nov 29 06:29:37 compute-2 sudo[125879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:37 compute-2 python3.9[125881]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:37 compute-2 sudo[125879]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:38 compute-2 sudo[126032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czssdzirxicojwmaqgqncythlsvfnjpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397778.1423593-687-196315067775915/AnsiballZ_command.py'
Nov 29 06:29:38 compute-2 sudo[126032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:38 compute-2 python3.9[126034]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:38 compute-2 ceph-mon[77142]: pgmap v514: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:38 compute-2 sudo[126032]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:38.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:39.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:39 compute-2 sudo[126187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrvvaxkfhmrfbymdjytwqpqvqqqmrkti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397778.9564004-711-81924760201110/AnsiballZ_blockinfile.py'
Nov 29 06:29:39 compute-2 sudo[126187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:39 compute-2 python3.9[126189]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:39 compute-2 sudo[126187]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:39 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:40 compute-2 sudo[126340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcxuhtufxvmgkvorgjabeifwypurqpzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397780.0740829-738-92352112931977/AnsiballZ_command.py'
Nov 29 06:29:40 compute-2 sudo[126340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:40 compute-2 python3.9[126342]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:40 compute-2 sudo[126340]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:40.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:41 compute-2 sudo[126493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noerunjpovxsgsviuhpcempudexcavmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397780.8840954-763-178657521443001/AnsiballZ_stat.py'
Nov 29 06:29:41 compute-2 sudo[126493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:41.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:42 compute-2 python3.9[126495]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:29:42 compute-2 sudo[126493]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-2 sudo[126498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:42 compute-2 sudo[126498]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:42 compute-2 sudo[126498]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-2 sudo[126535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:29:42 compute-2 sudo[126535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:42 compute-2 sudo[126535]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-2 sudo[126572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:42 compute-2 sudo[126572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:42 compute-2 sudo[126572]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-2 sudo[126621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Nov 29 06:29:42 compute-2 sudo[126621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:42 compute-2 ceph-mon[77142]: pgmap v515: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:42 compute-2 ceph-mon[77142]: pgmap v516: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:42 compute-2 sudo[126768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkwvtlkpiehbhtwpgamulgpmdzwiqnrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397782.3184695-787-273752261537918/AnsiballZ_command.py'
Nov 29 06:29:42 compute-2 sudo[126768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:42 compute-2 sudo[126621]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-2 python3.9[126770]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:42 compute-2 sudo[126768]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:42.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:43 compute-2 sudo[126798]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:43 compute-2 sudo[126798]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:43 compute-2 sudo[126798]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:43 compute-2 sudo[126823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:43 compute-2 sudo[126823]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:43 compute-2 sudo[126823]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:43.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:43 compute-2 sudo[126973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzwuoxhxejrpohcitdwxlrpyerkbrxif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397783.2205172-810-7506033702645/AnsiballZ_file.py'
Nov 29 06:29:43 compute-2 sudo[126973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:43 compute-2 python3.9[126975]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:43 compute-2 sudo[126973]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:44 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:29:44 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:44.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:45.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:45 compute-2 ceph-mon[77142]: pgmap v517: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:45 compute-2 ceph-mon[77142]: pgmap v518: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:45 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 06:29:45 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 06:29:46 compute-2 python3.9[127126]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:29:46 compute-2 sudo[127129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:46 compute-2 sudo[127129]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:46 compute-2 sudo[127129]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:46 compute-2 ceph-mon[77142]: pgmap v519: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:29:46 compute-2 sudo[127154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:29:46 compute-2 sudo[127154]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:46 compute-2 sudo[127154]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:46 compute-2 sudo[127179]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:46 compute-2 sudo[127179]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:46 compute-2 sudo[127179]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:46 compute-2 sudo[127204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:29:46 compute-2 sudo[127204]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:46.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:47.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:47 compute-2 ceph-mon[77142]: pgmap v520: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Nov 29 06:29:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:29:47 compute-2 podman[127326]: 2025-11-29 06:29:47.940451235 +0000 UTC m=+0.750553221 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Nov 29 06:29:48 compute-2 podman[127326]: 2025-11-29 06:29:48.038125393 +0000 UTC m=+0.848227369 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 29 06:29:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:29:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 5353 writes, 23K keys, 5353 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 5353 writes, 712 syncs, 7.52 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 5353 writes, 23K keys, 5353 commit groups, 1.0 writes per commit group, ingest: 18.68 MB, 0.03 MB/s
                                           Interval WAL: 5353 writes, 712 syncs, 7.52 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Nov 29 06:29:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:48.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:49 compute-2 podman[127532]: 2025-11-29 06:29:49.25847617 +0000 UTC m=+0.166948270 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:29:49 compute-2 sudo[127637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spfusphdcgxpzeukmyelslsfcvwoyusj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397788.9715526-930-113297010311970/AnsiballZ_command.py'
Nov 29 06:29:49 compute-2 sudo[127637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:49 compute-2 podman[127605]: 2025-11-29 06:29:49.33104155 +0000 UTC m=+0.057260588 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:29:49 compute-2 podman[127532]: 2025-11-29 06:29:49.338567833 +0000 UTC m=+0.247039913 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:29:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:49.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:49 compute-2 ceph-mon[77142]: pgmap v521: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:49 compute-2 python3.9[127639]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:49 compute-2 ovs-vsctl[127672]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 29 06:29:49 compute-2 sudo[127637]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:49 compute-2 podman[127673]: 2025-11-29 06:29:49.645345527 +0000 UTC m=+0.135979583 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, description=keepalived for Ceph, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=Ceph keepalived, release=1793, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, architecture=x86_64)
Nov 29 06:29:49 compute-2 podman[127717]: 2025-11-29 06:29:49.719956842 +0000 UTC m=+0.053296720 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, version=2.2.4, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-type=git, architecture=x86_64, description=keepalived for Ceph)
Nov 29 06:29:49 compute-2 podman[127673]: 2025-11-29 06:29:49.756465128 +0000 UTC m=+0.247099164 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, io.buildah.version=1.28.2, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, architecture=x86_64, release=1793, com.redhat.component=keepalived-container, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, version=2.2.4)
Nov 29 06:29:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:49 compute-2 sudo[127204]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:50 compute-2 sudo[127856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilanxocubnwcmmneleisuobzuvsmlist ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397790.0864608-957-116609616647620/AnsiballZ_command.py'
Nov 29 06:29:50 compute-2 sudo[127856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:50 compute-2 python3.9[127858]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:50 compute-2 sudo[127856]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:50.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:51.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:51 compute-2 ceph-mon[77142]: pgmap v522: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:52 compute-2 sudo[127887]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:52 compute-2 sudo[127887]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:52 compute-2 sudo[127887]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:52 compute-2 sudo[127912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:29:52 compute-2 sudo[127912]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:52 compute-2 sudo[127912]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:52 compute-2 sudo[127937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:52 compute-2 sudo[127937]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:52 compute-2 sudo[127937]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:52 compute-2 sudo[127962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:29:52 compute-2 sudo[127962]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:52.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:52 compute-2 sudo[128129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ricvpwwyxjmnruhjjqtuudwlgnkmnpxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397792.7158437-981-150764864059022/AnsiballZ_command.py'
Nov 29 06:29:52 compute-2 sudo[128129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:53 compute-2 sudo[127962]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:53 compute-2 python3.9[128132]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:53 compute-2 ovs-vsctl[128144]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 29 06:29:53 compute-2 sudo[128129]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:53.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:53 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:29:53 compute-2 ceph-mon[77142]: pgmap v523: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:53 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:29:54 compute-2 python3.9[128294]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:29:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:29:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:29:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:29:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:29:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:29:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:29:54 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:54.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:54 compute-2 sudo[128447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcpmynkyjequwirendnkfsyduuxwnlar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397794.5770445-1033-126952883336189/AnsiballZ_file.py'
Nov 29 06:29:54 compute-2 sudo[128447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:55 compute-2 python3.9[128449]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:55 compute-2 sudo[128447]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:55.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:55 compute-2 ceph-mon[77142]: pgmap v524: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:55 compute-2 sudo[128599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqgbpmldxrozmyqoqvkrnuevucnwnthc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397795.4221416-1056-55590551332242/AnsiballZ_stat.py'
Nov 29 06:29:55 compute-2 sudo[128599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:56 compute-2 python3.9[128601]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:56 compute-2 sudo[128599]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:56 compute-2 sudo[128677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhurdwlmbfyxihogpefchbytbhvkjxyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397795.4221416-1056-55590551332242/AnsiballZ_file.py'
Nov 29 06:29:56 compute-2 sudo[128677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:56 compute-2 python3.9[128680]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:56 compute-2 sudo[128677]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:56.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:57 compute-2 sudo[128830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xppepdrizgdfgjqclyocckoynmrfwyci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397796.7479312-1056-11782959354066/AnsiballZ_stat.py'
Nov 29 06:29:57 compute-2 sudo[128830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:57 compute-2 python3.9[128832]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:57 compute-2 sudo[128830]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:57.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:57 compute-2 sudo[128908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyseivbuzokflnfiwsththhsnxomcwql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397796.7479312-1056-11782959354066/AnsiballZ_file.py'
Nov 29 06:29:57 compute-2 sudo[128908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:57 compute-2 python3.9[128910]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:57 compute-2 sudo[128908]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:58 compute-2 sudo[129061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzzdcfbpogtldnrphqfllzjvhwcrtxjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397798.1651795-1126-119764276791619/AnsiballZ_file.py'
Nov 29 06:29:58 compute-2 sudo[129061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:58 compute-2 python3.9[129063]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:58 compute-2 sudo[129061]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:58.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:29:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:59.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:59 compute-2 sudo[129213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lewkspckmkayfscytavlglzkpuknttyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397799.0213006-1150-163543387931246/AnsiballZ_stat.py'
Nov 29 06:29:59 compute-2 sudo[129213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:59 compute-2 ceph-mon[77142]: pgmap v525: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:59 compute-2 python3.9[129215]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:59 compute-2 sudo[129213]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:00 compute-2 sudo[129291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-napohafdabfujxzjuifflacqecqyfgvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397799.0213006-1150-163543387931246/AnsiballZ_file.py'
Nov 29 06:30:00 compute-2 sudo[129291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:00 compute-2 python3.9[129293]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:00 compute-2 sudo[129291]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:00 compute-2 sudo[129444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahtvpssboxfabnamvxfmiybguhblfxcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397800.5386944-1185-50965966883274/AnsiballZ_stat.py'
Nov 29 06:30:00 compute-2 sudo[129444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:00.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:01 compute-2 python3.9[129446]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:01 compute-2 sudo[129444]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:01 compute-2 sudo[129522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldejayckibvhnktyrsdztpqoafhcdrwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397800.5386944-1185-50965966883274/AnsiballZ_file.py'
Nov 29 06:30:01 compute-2 sudo[129522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:30:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:01.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:30:01 compute-2 python3.9[129524]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:01 compute-2 sudo[129522]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:01 compute-2 ceph-mon[77142]: pgmap v526: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:01 compute-2 ceph-mon[77142]: overall HEALTH_OK
Nov 29 06:30:02 compute-2 sudo[129674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxlpwleucxzrwpkocoxbdpmeyseuojvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397801.865633-1222-229624891552937/AnsiballZ_systemd.py'
Nov 29 06:30:02 compute-2 sudo[129674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:02 compute-2 python3.9[129676]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:30:02 compute-2 systemd[1]: Reloading.
Nov 29 06:30:02 compute-2 systemd-rc-local-generator[129705]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:02 compute-2 systemd-sysv-generator[129710]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:02 compute-2 sudo[129674]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:02.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:03 compute-2 sudo[129791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:30:03 compute-2 sudo[129791]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:30:03 compute-2 sudo[129791]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:03 compute-2 sudo[129839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:30:03 compute-2 sudo[129839]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:30:03 compute-2 sudo[129839]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:03.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:03 compute-2 sudo[129914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yntyktbgqlhlpmknkghdplcsdedlypsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397803.0923443-1246-268302772848801/AnsiballZ_stat.py'
Nov 29 06:30:03 compute-2 sudo[129914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:03 compute-2 ceph-mon[77142]: pgmap v527: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:03 compute-2 python3.9[129916]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:03 compute-2 sudo[129914]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:03 compute-2 sudo[129992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvlzpbvroccfkkpbvalzsxtidtpezgtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397803.0923443-1246-268302772848801/AnsiballZ_file.py'
Nov 29 06:30:03 compute-2 sudo[129992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:04 compute-2 python3.9[129994]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:04 compute-2 sudo[129992]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:04 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:04 compute-2 sudo[130145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-selgnonygjbwthpukpxaxtgtlbgocpoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397804.5190234-1282-46468300447513/AnsiballZ_stat.py'
Nov 29 06:30:04 compute-2 sudo[130145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:04.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:05 compute-2 python3.9[130147]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:05 compute-2 sudo[130145]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:05 compute-2 sudo[130223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikctuhdccpscdsejghgllnvegcrtpgqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397804.5190234-1282-46468300447513/AnsiballZ_file.py'
Nov 29 06:30:05 compute-2 sudo[130223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:05.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:05 compute-2 ceph-mon[77142]: pgmap v528: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:05 compute-2 python3.9[130225]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:05 compute-2 sudo[130223]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:06 compute-2 sudo[130375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osgyvpzusxllumtvvhwbioewewdbhjfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397805.7671232-1318-24543016038629/AnsiballZ_systemd.py'
Nov 29 06:30:06 compute-2 sudo[130375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:06 compute-2 ceph-mon[77142]: pgmap v529: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:06 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:30:06 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:30:06 compute-2 python3.9[130377]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:30:06 compute-2 systemd[1]: Reloading.
Nov 29 06:30:06 compute-2 systemd-rc-local-generator[130404]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:06 compute-2 systemd-sysv-generator[130408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:06 compute-2 systemd[1]: Starting Create netns directory...
Nov 29 06:30:06 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:30:06 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:30:06 compute-2 systemd[1]: Finished Create netns directory.
Nov 29 06:30:06 compute-2 sudo[130375]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:06.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:06 compute-2 sudo[130445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:30:06 compute-2 sudo[130445]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:30:06 compute-2 sudo[130445]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:07 compute-2 sudo[130470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:30:07 compute-2 sudo[130470]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:30:07 compute-2 sudo[130470]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:07.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:07 compute-2 sudo[130620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvuzwfxcixpmpknangwfogshoqfmbqti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397807.3420098-1348-260737789501159/AnsiballZ_file.py'
Nov 29 06:30:07 compute-2 sudo[130620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:07 compute-2 python3.9[130622]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:07 compute-2 sudo[130620]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:08 compute-2 sudo[130773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpqrgaqbxznazeiakrblawgermqpujcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397808.1627903-1371-62291109263272/AnsiballZ_stat.py'
Nov 29 06:30:08 compute-2 sudo[130773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:08 compute-2 python3.9[130775]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:08 compute-2 sudo[130773]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:08.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:08 compute-2 sudo[130896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofcbbmfgopfjygupszpalpscybqztbvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397808.1627903-1371-62291109263272/AnsiballZ_copy.py'
Nov 29 06:30:08 compute-2 sudo[130896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:09 compute-2 python3.9[130898]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397808.1627903-1371-62291109263272/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:09 compute-2 sudo[130896]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:09.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:09 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:10 compute-2 sudo[131048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qykydrqopcvriifumxlcdeqrnekvqvxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397809.8187118-1423-62652477271704/AnsiballZ_file.py'
Nov 29 06:30:10 compute-2 sudo[131048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:10 compute-2 python3.9[131050]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:10 compute-2 sudo[131048]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:10.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:10 compute-2 sudo[131201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwoasugjrteievachqexpxtetsitypvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397810.6854649-1446-234358258156707/AnsiballZ_stat.py'
Nov 29 06:30:10 compute-2 sudo[131201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:11 compute-2 python3.9[131203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:11 compute-2 sudo[131201]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:11.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:11 compute-2 sudo[131324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewmqncpjggmhpvefdpxgcqfcjtlucqxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397810.6854649-1446-234358258156707/AnsiballZ_copy.py'
Nov 29 06:30:11 compute-2 sudo[131324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:11 compute-2 python3.9[131326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397810.6854649-1446-234358258156707/.source.json _original_basename=.b_0l47u2 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:11 compute-2 sudo[131324]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:12 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:30:12 compute-2 sudo[131477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewguputduhcbafaihipuzdpxysmkpxyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397812.158573-1492-233362876635970/AnsiballZ_file.py'
Nov 29 06:30:12 compute-2 sudo[131477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:12 compute-2 python3.9[131479]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:12 compute-2 sudo[131477]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:12.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:13 compute-2 sudo[131629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crujwewatazklctxnmkzfpqvfrzicukg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397813.042983-1515-88011055666635/AnsiballZ_stat.py'
Nov 29 06:30:13 compute-2 sudo[131629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:13.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:13 compute-2 sudo[131629]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:13 compute-2 sudo[131752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbrsgellwovcghxrqcinhqgmpdnfvmej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397813.042983-1515-88011055666635/AnsiballZ_copy.py'
Nov 29 06:30:13 compute-2 sudo[131752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:14 compute-2 sudo[131752]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:14 compute-2 ceph-mon[77142]: pgmap v530: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:14 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:14.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:15 compute-2 sudo[131906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pphdqntbecafbahnvtyukfqqgdsvufpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397814.9447577-1566-77590316834933/AnsiballZ_container_config_data.py'
Nov 29 06:30:15 compute-2 sudo[131906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:15.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:15 compute-2 python3.9[131908]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 29 06:30:15 compute-2 sudo[131906]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:16 compute-2 sudo[132059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygssoiejnzjomhwoytsstjjleaetxqbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397815.9582431-1593-50269227583022/AnsiballZ_container_config_hash.py'
Nov 29 06:30:16 compute-2 sudo[132059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:16 compute-2 python3.9[132061]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:30:16 compute-2 sudo[132059]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:16.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:17 compute-2 ceph-mon[77142]: pgmap v531: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:17 compute-2 ceph-mon[77142]: pgmap v532: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:17 compute-2 ceph-mon[77142]: pgmap v533: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:17 compute-2 ceph-mon[77142]: pgmap v534: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:17.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:17 compute-2 sudo[132211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmrlcmhdyzxrkfszbyivxidxpcxhtowr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397817.054295-1620-112225838841797/AnsiballZ_podman_container_info.py'
Nov 29 06:30:17 compute-2 sudo[132211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:17 compute-2 python3.9[132213]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 06:30:17 compute-2 sudo[132211]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:18 compute-2 ceph-mon[77142]: pgmap v535: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:18.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:19.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:19 compute-2 sudo[132390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojhocaimreilheiytkkqvenhswqavgwx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397818.9705687-1659-190965386712258/AnsiballZ_edpm_container_manage.py'
Nov 29 06:30:19 compute-2 sudo[132390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:19 compute-2 ceph-mon[77142]: pgmap v536: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:19 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:19 compute-2 python3[132392]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:30:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:20.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:21.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:21 compute-2 ceph-mon[77142]: pgmap v537: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:22 compute-2 ceph-mon[77142]: pgmap v538: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:22.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:23 compute-2 sudo[132457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:30:23 compute-2 sudo[132457]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:30:23 compute-2 sudo[132457]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:23.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:23 compute-2 sudo[132496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:30:23 compute-2 sudo[132496]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:30:23 compute-2 sudo[132496]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:24 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:24.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:25.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:26.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:27.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:27 compute-2 podman[132406]: 2025-11-29 06:30:27.560232244 +0000 UTC m=+7.634881301 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 06:30:27 compute-2 podman[132578]: 2025-11-29 06:30:27.676805423 +0000 UTC m=+0.020905146 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 06:30:28 compute-2 ceph-mon[77142]: pgmap v539: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:28 compute-2 podman[132578]: 2025-11-29 06:30:28.431236407 +0000 UTC m=+0.775336110 container create 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 06:30:28 compute-2 python3[132392]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 06:30:28 compute-2 sudo[132390]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:28.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:29.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:29 compute-2 ceph-mon[77142]: pgmap v540: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:29 compute-2 ceph-mon[77142]: pgmap v541: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:29 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:30 compute-2 sudo[132767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eutddranrpvkspvvcrowaqrccpyppqnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397830.1055377-1684-90922951328127/AnsiballZ_stat.py'
Nov 29 06:30:30 compute-2 sudo[132767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:30 compute-2 python3.9[132769]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:30:30 compute-2 sudo[132767]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:30.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:31 compute-2 sudo[132921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsptgkoqafxtvorxvmncetjfurxfptqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397831.0233674-1710-178680506133044/AnsiballZ_file.py'
Nov 29 06:30:31 compute-2 sudo[132921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:31.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:32.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:33.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:34 compute-2 python3.9[132923]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:34 compute-2 sudo[132921]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:34 compute-2 sudo[133001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmqxztmcqpatuomreqbesldcdwwafzqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397831.0233674-1710-178680506133044/AnsiballZ_stat.py'
Nov 29 06:30:34 compute-2 sudo[133001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:34 compute-2 python3.9[133003]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:30:34 compute-2 sudo[133001]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:34.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:35 compute-2 sudo[133152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppyxhmailrkugmvpnroukybzbhzmavcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397834.9574444-1710-187878217196123/AnsiballZ_copy.py'
Nov 29 06:30:35 compute-2 sudo[133152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:35.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:35 compute-2 python3.9[133154]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397834.9574444-1710-187878217196123/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:35 compute-2 sudo[133152]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:36 compute-2 sudo[133228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzrbfqytxzvbyeecevylmelvtvoaarjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397834.9574444-1710-187878217196123/AnsiballZ_systemd.py'
Nov 29 06:30:36 compute-2 sudo[133228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:36 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:30:36 compute-2 python3.9[133230]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:30:36 compute-2 systemd[1]: Reloading.
Nov 29 06:30:36 compute-2 systemd-rc-local-generator[133257]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:36 compute-2 systemd-sysv-generator[133262]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:36.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:37 compute-2 sudo[133228]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:37.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:37 compute-2 sudo[133341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qahhtxspoeiqljsjosezwgynpjbhioee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397834.9574444-1710-187878217196123/AnsiballZ_systemd.py'
Nov 29 06:30:37 compute-2 sudo[133341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:38 compute-2 python3.9[133343]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:30:38 compute-2 systemd[1]: Reloading.
Nov 29 06:30:38 compute-2 systemd-sysv-generator[133374]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:38 compute-2 systemd-rc-local-generator[133366]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:38 compute-2 ceph-mon[77142]: pgmap v542: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:38 compute-2 systemd[1]: Starting ovn_controller container...
Nov 29 06:30:38 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:30:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af8eceb0d62786f3349e40b8f178df504877b374937d019d99a45125a9ac3338/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 06:30:38 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947.
Nov 29 06:30:38 compute-2 podman[133386]: 2025-11-29 06:30:38.887306839 +0000 UTC m=+0.132132740 container init 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:30:38 compute-2 ovn_controller[133401]: + sudo -E kolla_set_configs
Nov 29 06:30:38 compute-2 podman[133386]: 2025-11-29 06:30:38.920162756 +0000 UTC m=+0.164988657 container start 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:30:38 compute-2 edpm-start-podman-container[133386]: ovn_controller
Nov 29 06:30:38 compute-2 systemd[1]: Created slice User Slice of UID 0.
Nov 29 06:30:38 compute-2 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 29 06:30:38 compute-2 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 29 06:30:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:38.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:38 compute-2 systemd[1]: Starting User Manager for UID 0...
Nov 29 06:30:38 compute-2 systemd[133441]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 29 06:30:38 compute-2 edpm-start-podman-container[133385]: Creating additional drop-in dependency for "ovn_controller" (4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947)
Nov 29 06:30:38 compute-2 podman[133408]: 2025-11-29 06:30:38.999155259 +0000 UTC m=+0.069355284 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_controller)
Nov 29 06:30:39 compute-2 systemd[1]: 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947-2c86bc44c568e13f.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:30:39 compute-2 systemd[1]: 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947-2c86bc44c568e13f.service: Failed with result 'exit-code'.
Nov 29 06:30:39 compute-2 systemd[1]: Reloading.
Nov 29 06:30:39 compute-2 systemd-rc-local-generator[133489]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:39 compute-2 systemd-sysv-generator[133492]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:39 compute-2 systemd[133441]: Queued start job for default target Main User Target.
Nov 29 06:30:39 compute-2 systemd[133441]: Created slice User Application Slice.
Nov 29 06:30:39 compute-2 systemd[133441]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 29 06:30:39 compute-2 systemd[133441]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 06:30:39 compute-2 systemd[133441]: Reached target Paths.
Nov 29 06:30:39 compute-2 systemd[133441]: Reached target Timers.
Nov 29 06:30:39 compute-2 systemd[133441]: Starting D-Bus User Message Bus Socket...
Nov 29 06:30:39 compute-2 systemd[133441]: Starting Create User's Volatile Files and Directories...
Nov 29 06:30:39 compute-2 systemd[133441]: Listening on D-Bus User Message Bus Socket.
Nov 29 06:30:39 compute-2 systemd[133441]: Reached target Sockets.
Nov 29 06:30:39 compute-2 systemd[133441]: Finished Create User's Volatile Files and Directories.
Nov 29 06:30:39 compute-2 systemd[133441]: Reached target Basic System.
Nov 29 06:30:39 compute-2 systemd[133441]: Reached target Main User Target.
Nov 29 06:30:39 compute-2 systemd[133441]: Startup finished in 151ms.
Nov 29 06:30:39 compute-2 systemd[1]: Started User Manager for UID 0.
Nov 29 06:30:39 compute-2 systemd[1]: Started ovn_controller container.
Nov 29 06:30:39 compute-2 systemd[1]: Started Session c1 of User root.
Nov 29 06:30:39 compute-2 sudo[133341]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:39 compute-2 ceph-mon[77142]: pgmap v543: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:39 compute-2 ceph-mon[77142]: pgmap v544: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:39 compute-2 ceph-mon[77142]: pgmap v545: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:39 compute-2 ovn_controller[133401]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:30:39 compute-2 ovn_controller[133401]: INFO:__main__:Validating config file
Nov 29 06:30:39 compute-2 ovn_controller[133401]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:30:39 compute-2 ovn_controller[133401]: INFO:__main__:Writing out command to execute
Nov 29 06:30:39 compute-2 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 29 06:30:39 compute-2 ovn_controller[133401]: ++ cat /run_command
Nov 29 06:30:39 compute-2 ovn_controller[133401]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 06:30:39 compute-2 ovn_controller[133401]: + ARGS=
Nov 29 06:30:39 compute-2 ovn_controller[133401]: + sudo kolla_copy_cacerts
Nov 29 06:30:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:39.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:39 compute-2 systemd[1]: Started Session c2 of User root.
Nov 29 06:30:39 compute-2 ovn_controller[133401]: + [[ ! -n '' ]]
Nov 29 06:30:39 compute-2 ovn_controller[133401]: + . kolla_extend_start
Nov 29 06:30:39 compute-2 ovn_controller[133401]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 29 06:30:39 compute-2 ovn_controller[133401]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 06:30:39 compute-2 ovn_controller[133401]: + umask 0022
Nov 29 06:30:39 compute-2 ovn_controller[133401]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 29 06:30:39 compute-2 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 29 06:30:39 compute-2 NetworkManager[48989]: <info>  [1764397839.5169] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Nov 29 06:30:39 compute-2 NetworkManager[48989]: <info>  [1764397839.5176] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:30:39 compute-2 NetworkManager[48989]: <info>  [1764397839.5186] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 29 06:30:39 compute-2 NetworkManager[48989]: <info>  [1764397839.5190] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Nov 29 06:30:39 compute-2 NetworkManager[48989]: <info>  [1764397839.5193] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 06:30:39 compute-2 kernel: br-int: entered promiscuous mode
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:30:39 compute-2 ovn_controller[133401]: 2025-11-29T06:30:39Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:30:39 compute-2 NetworkManager[48989]: <info>  [1764397839.5398] manager: (ovn-2fa832-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 29 06:30:39 compute-2 NetworkManager[48989]: <info>  [1764397839.5405] manager: (ovn-e15f55-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 29 06:30:39 compute-2 systemd-udevd[133535]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:30:39 compute-2 kernel: genev_sys_6081: entered promiscuous mode
Nov 29 06:30:39 compute-2 NetworkManager[48989]: <info>  [1764397839.5556] device (genev_sys_6081): carrier: link connected
Nov 29 06:30:39 compute-2 NetworkManager[48989]: <info>  [1764397839.5559] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Nov 29 06:30:39 compute-2 systemd-udevd[133537]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:30:40 compute-2 NetworkManager[48989]: <info>  [1764397840.0798] manager: (ovn-93db78-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Nov 29 06:30:40 compute-2 ceph-mon[77142]: pgmap v546: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:40 compute-2 sudo[133666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwdzlzwvttzktzczfigttpsxofanppqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397840.1216033-1794-114859441481357/AnsiballZ_command.py'
Nov 29 06:30:40 compute-2 sudo[133666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:40 compute-2 python3.9[133668]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:40 compute-2 ovs-vsctl[133669]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 29 06:30:40 compute-2 sudo[133666]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:40.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:41 compute-2 sudo[133819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfnuzskpgiqguldngmsdyspmobkudgxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397840.9408345-1819-124273571586262/AnsiballZ_command.py'
Nov 29 06:30:41 compute-2 sudo[133819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:41 compute-2 python3.9[133821]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:41 compute-2 ovs-vsctl[133823]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 29 06:30:41 compute-2 sudo[133819]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:41.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:42 compute-2 ceph-mon[77142]: pgmap v547: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:42 compute-2 sudo[133975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwxyforgxxolrllbzhysredtrgeqxtln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397842.1053607-1860-273881691302090/AnsiballZ_command.py'
Nov 29 06:30:42 compute-2 sudo[133975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:42 compute-2 python3.9[133977]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:42 compute-2 ovs-vsctl[133978]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 29 06:30:42 compute-2 sudo[133975]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:42.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:43 compute-2 sshd-session[121822]: Connection closed by 192.168.122.30 port 49290
Nov 29 06:30:43 compute-2 sshd-session[121776]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:30:43 compute-2 systemd-logind[784]: Session 45 logged out. Waiting for processes to exit.
Nov 29 06:30:43 compute-2 systemd[1]: session-45.scope: Deactivated successfully.
Nov 29 06:30:43 compute-2 systemd[1]: session-45.scope: Consumed 57.493s CPU time.
Nov 29 06:30:43 compute-2 systemd-logind[784]: Removed session 45.
Nov 29 06:30:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:43.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:43 compute-2 ceph-mon[77142]: pgmap v548: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:43 compute-2 sudo[134003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:30:43 compute-2 sudo[134003]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:30:43 compute-2 sudo[134003]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:43 compute-2 sudo[134028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:30:43 compute-2 sudo[134028]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:30:43 compute-2 sudo[134028]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:44.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:45.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:46 compute-2 ceph-mon[77142]: pgmap v549: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:46.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:47.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:48 compute-2 ceph-mon[77142]: pgmap v550: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:48 compute-2 sshd-session[134056]: Accepted publickey for zuul from 192.168.122.30 port 52788 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:30:48 compute-2 systemd-logind[784]: New session 47 of user zuul.
Nov 29 06:30:48 compute-2 systemd[1]: Started Session 47 of User zuul.
Nov 29 06:30:48 compute-2 sshd-session[134056]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:30:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:48.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:49 compute-2 sshd-session[134058]: error: maximum authentication attempts exceeded for root from 70.184.117.114 port 44710 ssh2 [preauth]
Nov 29 06:30:49 compute-2 sshd-session[134058]: Disconnecting authenticating user root 70.184.117.114 port 44710: Too many authentication failures [preauth]
Nov 29 06:30:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:49.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:49 compute-2 systemd[1]: Stopping User Manager for UID 0...
Nov 29 06:30:49 compute-2 systemd[133441]: Activating special unit Exit the Session...
Nov 29 06:30:49 compute-2 systemd[133441]: Stopped target Main User Target.
Nov 29 06:30:49 compute-2 systemd[133441]: Stopped target Basic System.
Nov 29 06:30:49 compute-2 systemd[133441]: Stopped target Paths.
Nov 29 06:30:49 compute-2 systemd[133441]: Stopped target Sockets.
Nov 29 06:30:49 compute-2 systemd[133441]: Stopped target Timers.
Nov 29 06:30:49 compute-2 systemd[133441]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 06:30:49 compute-2 systemd[133441]: Closed D-Bus User Message Bus Socket.
Nov 29 06:30:49 compute-2 systemd[133441]: Stopped Create User's Volatile Files and Directories.
Nov 29 06:30:49 compute-2 systemd[133441]: Removed slice User Application Slice.
Nov 29 06:30:49 compute-2 systemd[133441]: Reached target Shutdown.
Nov 29 06:30:49 compute-2 systemd[133441]: Finished Exit the Session.
Nov 29 06:30:49 compute-2 systemd[133441]: Reached target Exit the Session.
Nov 29 06:30:49 compute-2 systemd[1]: user@0.service: Deactivated successfully.
Nov 29 06:30:49 compute-2 systemd[1]: Stopped User Manager for UID 0.
Nov 29 06:30:49 compute-2 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 29 06:30:49 compute-2 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 29 06:30:49 compute-2 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 29 06:30:49 compute-2 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 29 06:30:49 compute-2 systemd[1]: Removed slice User Slice of UID 0.
Nov 29 06:30:49 compute-2 python3.9[134211]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:30:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:50.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:51 compute-2 sshd-session[134212]: error: maximum authentication attempts exceeded for root from 70.184.117.114 port 44720 ssh2 [preauth]
Nov 29 06:30:51 compute-2 sshd-session[134212]: Disconnecting authenticating user root 70.184.117.114 port 44720: Too many authentication failures [preauth]
Nov 29 06:30:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:51.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:52 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:30:52 compute-2 sudo[134374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubzruwkwhwabvjejzepbqouomnzyxgqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397852.1214464-70-9675423435232/AnsiballZ_file.py'
Nov 29 06:30:52 compute-2 sudo[134374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:52 compute-2 sshd-session[134246]: error: maximum authentication attempts exceeded for root from 70.184.117.114 port 44732 ssh2 [preauth]
Nov 29 06:30:52 compute-2 sshd-session[134246]: Disconnecting authenticating user root 70.184.117.114 port 44732: Too many authentication failures [preauth]
Nov 29 06:30:52 compute-2 python3.9[134376]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:52 compute-2 sudo[134374]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:52.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:53 compute-2 sudo[134530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdosbmmxrsjfphywwvnrsjrkusogeaex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397853.000067-70-261725565903243/AnsiballZ_file.py'
Nov 29 06:30:53 compute-2 sudo[134530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:53 compute-2 sshd-session[134379]: Invalid user centos from 92.118.39.92 port 38722
Nov 29 06:30:53 compute-2 sshd-session[134379]: Connection closed by invalid user centos 92.118.39.92 port 38722 [preauth]
Nov 29 06:30:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:53.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:53 compute-2 python3.9[134532]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:53 compute-2 sudo[134530]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:53 compute-2 ceph-mon[77142]: pgmap v551: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:53 compute-2 ceph-mon[77142]: pgmap v552: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:53 compute-2 ceph-mon[77142]: pgmap v553: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:53 compute-2 sudo[134682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szijdkfvmptahzpwexjyrsaupzqtmnrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397853.6699934-70-5346515281020/AnsiballZ_file.py'
Nov 29 06:30:53 compute-2 sudo[134682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:54 compute-2 sshd-session[134377]: Received disconnect from 70.184.117.114 port 44742:11: disconnected by user [preauth]
Nov 29 06:30:54 compute-2 sshd-session[134377]: Disconnected from authenticating user root 70.184.117.114 port 44742 [preauth]
Nov 29 06:30:54 compute-2 python3.9[134684]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:54 compute-2 sudo[134682]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:54 compute-2 sudo[134837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxbcjcycuirffxcryfptxjavvtohaeyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397854.4840145-70-133643980302256/AnsiballZ_file.py'
Nov 29 06:30:54 compute-2 sudo[134837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:54 compute-2 python3.9[134839]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:54.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:55 compute-2 sudo[134837]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:55 compute-2 ceph-mon[77142]: pgmap v554: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:55 compute-2 sshd-session[134685]: Invalid user admin from 70.184.117.114 port 44752
Nov 29 06:30:55 compute-2 sudo[134989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzeitspjiloemozcxbmnlnoeuunkbnjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397855.1607418-70-130572079422045/AnsiballZ_file.py'
Nov 29 06:30:55 compute-2 sudo[134989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:55.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:55 compute-2 python3.9[134991]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:55 compute-2 sudo[134989]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:55 compute-2 sshd-session[134685]: error: maximum authentication attempts exceeded for invalid user admin from 70.184.117.114 port 44752 ssh2 [preauth]
Nov 29 06:30:55 compute-2 sshd-session[134685]: Disconnecting invalid user admin 70.184.117.114 port 44752: Too many authentication failures [preauth]
Nov 29 06:30:56 compute-2 python3.9[135143]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:30:56 compute-2 sshd-session[135042]: Invalid user admin from 70.184.117.114 port 37496
Nov 29 06:30:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:56.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:57 compute-2 sshd-session[135042]: error: maximum authentication attempts exceeded for invalid user admin from 70.184.117.114 port 37496 ssh2 [preauth]
Nov 29 06:30:57 compute-2 sshd-session[135042]: Disconnecting invalid user admin 70.184.117.114 port 37496: Too many authentication failures [preauth]
Nov 29 06:30:57 compute-2 sudo[135294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cybhyhsoaxkryapnvsjzoztjnezcmvgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397856.8335612-201-229201560739191/AnsiballZ_seboolean.py'
Nov 29 06:30:57 compute-2 sudo[135294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:57.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:57 compute-2 python3.9[135296]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 06:30:57 compute-2 ceph-mon[77142]: pgmap v555: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:58 compute-2 sshd-session[135297]: Invalid user admin from 70.184.117.114 port 37506
Nov 29 06:30:58 compute-2 sudo[135294]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:58 compute-2 sshd-session[135297]: Received disconnect from 70.184.117.114 port 37506:11: disconnected by user [preauth]
Nov 29 06:30:58 compute-2 sshd-session[135297]: Disconnected from invalid user admin 70.184.117.114 port 37506 [preauth]
Nov 29 06:30:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:58.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:59 compute-2 python3.9[135452]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:59 compute-2 sshd-session[135377]: Invalid user oracle from 70.184.117.114 port 37518
Nov 29 06:30:59 compute-2 ceph-mon[77142]: pgmap v556: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:30:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:59.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:59 compute-2 sshd-session[135377]: error: maximum authentication attempts exceeded for invalid user oracle from 70.184.117.114 port 37518 ssh2 [preauth]
Nov 29 06:30:59 compute-2 sshd-session[135377]: Disconnecting invalid user oracle 70.184.117.114 port 37518: Too many authentication failures [preauth]
Nov 29 06:30:59 compute-2 python3.9[135573]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397858.4530733-225-200257258328877/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:00 compute-2 python3.9[135725]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:00 compute-2 sshd-session[135577]: Invalid user oracle from 70.184.117.114 port 37534
Nov 29 06:31:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:00 compute-2 python3.9[135847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397859.9660137-270-18791181791034/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:01.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:01.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:01 compute-2 ceph-mon[77142]: pgmap v557: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:01 compute-2 sshd-session[135577]: error: maximum authentication attempts exceeded for invalid user oracle from 70.184.117.114 port 37534 ssh2 [preauth]
Nov 29 06:31:01 compute-2 sshd-session[135577]: Disconnecting invalid user oracle 70.184.117.114 port 37534: Too many authentication failures [preauth]
Nov 29 06:31:01 compute-2 sudo[135997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdbbayjcdhlzyoravvgxabtirgyxfbgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397861.577899-321-188621588628549/AnsiballZ_setup.py'
Nov 29 06:31:01 compute-2 sudo[135997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:02 compute-2 python3.9[135999]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:31:02 compute-2 sudo[135997]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:02 compute-2 sudo[136084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdynenswbfcqyytgbsafbjguzhkmspma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397861.577899-321-188621588628549/AnsiballZ_dnf.py'
Nov 29 06:31:02 compute-2 sudo[136084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:02 compute-2 sshd-session[136000]: Invalid user oracle from 70.184.117.114 port 37542
Nov 29 06:31:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:03.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:03 compute-2 python3.9[136086]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:31:03 compute-2 sshd-session[136000]: Received disconnect from 70.184.117.114 port 37542:11: disconnected by user [preauth]
Nov 29 06:31:03 compute-2 sshd-session[136000]: Disconnected from invalid user oracle 70.184.117.114 port 37542 [preauth]
Nov 29 06:31:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:03.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:03 compute-2 ceph-mon[77142]: pgmap v558: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 6.2 KiB/s rd, 0 B/s wr, 10 op/s
Nov 29 06:31:03 compute-2 sudo[136090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:03 compute-2 sudo[136090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:03 compute-2 sudo[136090]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:03 compute-2 sudo[136115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:03 compute-2 sudo[136115]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:03 compute-2 sudo[136115]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:03 compute-2 sshd-session[136088]: Invalid user usuario from 70.184.117.114 port 37558
Nov 29 06:31:04 compute-2 sshd-session[136088]: error: maximum authentication attempts exceeded for invalid user usuario from 70.184.117.114 port 37558 ssh2 [preauth]
Nov 29 06:31:04 compute-2 sshd-session[136088]: Disconnecting invalid user usuario 70.184.117.114 port 37558: Too many authentication failures [preauth]
Nov 29 06:31:04 compute-2 sudo[136084]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:05.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:05.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:05 compute-2 sshd-session[136141]: Invalid user usuario from 70.184.117.114 port 37574
Nov 29 06:31:05 compute-2 sudo[136292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhxfsbdrcewdojytxfcuxezpfwwgnvwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397864.9609983-357-93119704518985/AnsiballZ_systemd.py'
Nov 29 06:31:05 compute-2 sudo[136292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:05 compute-2 ceph-mon[77142]: pgmap v559: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 40 KiB/s rd, 0 B/s wr, 66 op/s
Nov 29 06:31:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:05 compute-2 python3.9[136294]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:31:05 compute-2 sshd-session[136141]: error: maximum authentication attempts exceeded for invalid user usuario from 70.184.117.114 port 37574 ssh2 [preauth]
Nov 29 06:31:05 compute-2 sshd-session[136141]: Disconnecting invalid user usuario 70.184.117.114 port 37574: Too many authentication failures [preauth]
Nov 29 06:31:05 compute-2 sudo[136292]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:06 compute-2 python3.9[136450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:06 compute-2 sshd-session[136322]: Invalid user usuario from 70.184.117.114 port 42130
Nov 29 06:31:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:07.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:07 compute-2 sshd-session[136322]: Received disconnect from 70.184.117.114 port 42130:11: disconnected by user [preauth]
Nov 29 06:31:07 compute-2 sshd-session[136322]: Disconnected from invalid user usuario 70.184.117.114 port 42130 [preauth]
Nov 29 06:31:07 compute-2 sudo[136572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:07 compute-2 sudo[136572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:07 compute-2 sudo[136572]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:07 compute-2 python3.9[136571]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397866.1569865-381-61470608382686/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:07 compute-2 sudo[136597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:31:07 compute-2 sudo[136597]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:07 compute-2 sudo[136597]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:07 compute-2 sudo[136624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:07 compute-2 sudo[136624]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:07 compute-2 sudo[136624]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:07 compute-2 sudo[136673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:31:07 compute-2 sudo[136673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:07.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:07 compute-2 python3.9[136875]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:07 compute-2 podman[136896]: 2025-11-29 06:31:07.851468329 +0000 UTC m=+0.092945522 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:31:07 compute-2 podman[136896]: 2025-11-29 06:31:07.98111456 +0000 UTC m=+0.222591693 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:31:08 compute-2 sshd-session[136610]: Invalid user test from 70.184.117.114 port 42144
Nov 29 06:31:08 compute-2 ceph-mon[77142]: pgmap v560: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 40 KiB/s rd, 0 B/s wr, 66 op/s
Nov 29 06:31:08 compute-2 python3.9[137089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397867.2855337-381-110600878901683/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:08 compute-2 podman[137199]: 2025-11-29 06:31:08.570419304 +0000 UTC m=+0.068645665 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:31:08 compute-2 sshd-session[136610]: error: maximum authentication attempts exceeded for invalid user test from 70.184.117.114 port 42144 ssh2 [preauth]
Nov 29 06:31:08 compute-2 sshd-session[136610]: Disconnecting invalid user test 70.184.117.114 port 42144: Too many authentication failures [preauth]
Nov 29 06:31:08 compute-2 podman[137199]: 2025-11-29 06:31:08.608235196 +0000 UTC m=+0.106461527 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:31:08 compute-2 podman[137266]: 2025-11-29 06:31:08.878303069 +0000 UTC m=+0.077152474 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, build-date=2023-02-22T09:23:20, vcs-type=git, description=keepalived for Ceph, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, architecture=x86_64, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 06:31:08 compute-2 podman[137266]: 2025-11-29 06:31:08.899199724 +0000 UTC m=+0.098049089 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, build-date=2023-02-22T09:23:20, release=1793, vendor=Red Hat, Inc., version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64)
Nov 29 06:31:08 compute-2 sudo[136673]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:09.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:09 compute-2 sudo[137300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:09 compute-2 sudo[137300]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:09 compute-2 sudo[137300]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:09 compute-2 sudo[137326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:31:09 compute-2 sudo[137326]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:09 compute-2 sudo[137326]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:09 compute-2 ovn_controller[133401]: 2025-11-29T06:31:09Z|00025|memory|INFO|16128 kB peak resident set size after 29.9 seconds
Nov 29 06:31:09 compute-2 ovn_controller[133401]: 2025-11-29T06:31:09Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Nov 29 06:31:09 compute-2 podman[137324]: 2025-11-29 06:31:09.384429808 +0000 UTC m=+0.115711526 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:31:09 compute-2 sudo[137367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:09 compute-2 sudo[137367]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:09 compute-2 sudo[137367]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:09 compute-2 ceph-mon[77142]: pgmap v561: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 67 KiB/s rd, 0 B/s wr, 112 op/s
Nov 29 06:31:09 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:31:09 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:31:09 compute-2 sudo[137401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:31:09 compute-2 sudo[137401]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:09.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:09 compute-2 sudo[137401]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:10 compute-2 python3.9[137570]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:10 compute-2 sshd-session[137252]: Invalid user test from 70.184.117.114 port 42150
Nov 29 06:31:10 compute-2 sshd-session[137252]: error: maximum authentication attempts exceeded for invalid user test from 70.184.117.114 port 42150 ssh2 [preauth]
Nov 29 06:31:10 compute-2 sshd-session[137252]: Disconnecting invalid user test 70.184.117.114 port 42150: Too many authentication failures [preauth]
Nov 29 06:31:10 compute-2 python3.9[137704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397869.5004444-513-211110170257389/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 06:31:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:31:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:31:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:31:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:31:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:31:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:31:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:11.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:11 compute-2 python3.9[137856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:11.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:11 compute-2 sshd-session[137705]: Invalid user test from 70.184.117.114 port 42158
Nov 29 06:31:11 compute-2 python3.9[137977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397870.8714752-513-155870918028675/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:11 compute-2 sshd-session[137705]: Received disconnect from 70.184.117.114 port 42158:11: disconnected by user [preauth]
Nov 29 06:31:11 compute-2 sshd-session[137705]: Disconnected from invalid user test 70.184.117.114 port 42158 [preauth]
Nov 29 06:31:12 compute-2 ceph-mon[77142]: pgmap v562: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 75 KiB/s rd, 0 B/s wr, 124 op/s
Nov 29 06:31:12 compute-2 python3.9[138130]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:31:12 compute-2 sshd-session[137981]: Invalid user user from 70.184.117.114 port 42162
Nov 29 06:31:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:13.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:13 compute-2 sudo[138282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iapdywyuyvmachibikrnjjxwucljtrkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397873.102263-627-106741947643668/AnsiballZ_file.py'
Nov 29 06:31:13 compute-2 sudo[138282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:13.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:13 compute-2 sshd-session[137981]: error: maximum authentication attempts exceeded for invalid user user from 70.184.117.114 port 42162 ssh2 [preauth]
Nov 29 06:31:13 compute-2 sshd-session[137981]: Disconnecting invalid user user 70.184.117.114 port 42162: Too many authentication failures [preauth]
Nov 29 06:31:13 compute-2 python3.9[138284]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:13 compute-2 sudo[138282]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:14 compute-2 sudo[138436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwzkbcoinzrthkwfiapppgdlyegmrxka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397873.9004772-652-136687936101265/AnsiballZ_stat.py'
Nov 29 06:31:14 compute-2 sudo[138436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:14 compute-2 python3.9[138438]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:14 compute-2 sudo[138436]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:14 compute-2 sudo[138515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daikdsyrqwkaocptqkrnkdyshjuiknyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397873.9004772-652-136687936101265/AnsiballZ_file.py'
Nov 29 06:31:14 compute-2 sudo[138515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:14 compute-2 python3.9[138517]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:15 compute-2 sudo[138515]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:15.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:15 compute-2 sshd-session[138309]: Invalid user user from 70.184.117.114 port 42172
Nov 29 06:31:15 compute-2 sudo[138667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liexmjcotojaanwkaflzdzksvfpynskp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397875.1459715-652-118386950164056/AnsiballZ_stat.py'
Nov 29 06:31:15 compute-2 sudo[138667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:15 compute-2 sshd-session[138309]: error: maximum authentication attempts exceeded for invalid user user from 70.184.117.114 port 42172 ssh2 [preauth]
Nov 29 06:31:15 compute-2 sshd-session[138309]: Disconnecting invalid user user 70.184.117.114 port 42172: Too many authentication failures [preauth]
Nov 29 06:31:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:15.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:15 compute-2 python3.9[138669]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:15 compute-2 sudo[138667]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:15 compute-2 sudo[138747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzdcdaysttmevltqhstdvxbnqshaczzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397875.1459715-652-118386950164056/AnsiballZ_file.py'
Nov 29 06:31:15 compute-2 sudo[138747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:16 compute-2 python3.9[138749]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:16 compute-2 sudo[138747]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:16 compute-2 sudo[138900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koslrvgvknlzfgluobjoxiakumxgthjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397876.556366-720-231605734849234/AnsiballZ_file.py'
Nov 29 06:31:16 compute-2 sudo[138900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:17 compute-2 python3.9[138902]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:17.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:17 compute-2 sudo[138900]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:17 compute-2 sshd-session[138672]: Invalid user user from 70.184.117.114 port 42180
Nov 29 06:31:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:17.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:17 compute-2 sshd-session[138672]: Received disconnect from 70.184.117.114 port 42180:11: disconnected by user [preauth]
Nov 29 06:31:17 compute-2 sshd-session[138672]: Disconnected from invalid user user 70.184.117.114 port 42180 [preauth]
Nov 29 06:31:17 compute-2 sudo[139052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnxdehyfmzylpqurvpvburajgyroozzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397877.406666-744-39739517313324/AnsiballZ_stat.py'
Nov 29 06:31:17 compute-2 sudo[139052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:17 compute-2 python3.9[139054]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:17 compute-2 sudo[139052]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:18 compute-2 sudo[139132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfekzrsskfaflgoaeppvgprvrjrwjnmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397877.406666-744-39739517313324/AnsiballZ_file.py'
Nov 29 06:31:18 compute-2 sudo[139132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:18 compute-2 python3.9[139134]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:18 compute-2 sudo[139132]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:18 compute-2 sshd-session[139055]: Invalid user ftpuser from 70.184.117.114 port 46520
Nov 29 06:31:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:19.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:19 compute-2 sshd-session[139055]: error: maximum authentication attempts exceeded for invalid user ftpuser from 70.184.117.114 port 46520 ssh2 [preauth]
Nov 29 06:31:19 compute-2 sshd-session[139055]: Disconnecting invalid user ftpuser 70.184.117.114 port 46520: Too many authentication failures [preauth]
Nov 29 06:31:19 compute-2 sudo[139287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnnbnsayhqtmnwkqhdrldqddmsvmohwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397879.0734966-780-21282867600367/AnsiballZ_stat.py'
Nov 29 06:31:19 compute-2 sudo[139287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:19.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:19 compute-2 python3.9[139289]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:19 compute-2 sudo[139287]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:19 compute-2 sudo[139365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsbfgobmtrpnuonzaslfsatacssxkvat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397879.0734966-780-21282867600367/AnsiballZ_file.py'
Nov 29 06:31:19 compute-2 sudo[139365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:19 compute-2 python3.9[139367]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:19 compute-2 sudo[139365]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:20 compute-2 sshd-session[139235]: Invalid user ftpuser from 70.184.117.114 port 46524
Nov 29 06:31:20 compute-2 ceph-mon[77142]: pgmap v563: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 86 KiB/s rd, 0 B/s wr, 142 op/s
Nov 29 06:31:20 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:31:20 compute-2 sshd-session[139235]: error: maximum authentication attempts exceeded for invalid user ftpuser from 70.184.117.114 port 46524 ssh2 [preauth]
Nov 29 06:31:20 compute-2 sshd-session[139235]: Disconnecting invalid user ftpuser 70.184.117.114 port 46524: Too many authentication failures [preauth]
Nov 29 06:31:20 compute-2 sudo[139520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtsvkyibvlmqqefzvovhokudrtesphnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397880.4405048-817-184021401712103/AnsiballZ_systemd.py'
Nov 29 06:31:20 compute-2 sudo[139520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:21 compute-2 python3.9[139522]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:31:21 compute-2 systemd[1]: Reloading.
Nov 29 06:31:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:31:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:21.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:31:21 compute-2 systemd-sysv-generator[139555]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:31:21 compute-2 systemd-rc-local-generator[139551]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:31:21 compute-2 sudo[139520]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:21 compute-2 sshd-session[139468]: Invalid user ftpuser from 70.184.117.114 port 46528
Nov 29 06:31:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:21.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:21 compute-2 ceph-mon[77142]: pgmap v564: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 88 KiB/s rd, 0 B/s wr, 146 op/s
Nov 29 06:31:21 compute-2 ceph-mon[77142]: pgmap v565: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 54 KiB/s rd, 0 B/s wr, 90 op/s
Nov 29 06:31:21 compute-2 ceph-mon[77142]: pgmap v566: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 54 KiB/s rd, 0 B/s wr, 90 op/s
Nov 29 06:31:21 compute-2 ceph-mon[77142]: pgmap v567: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 26 KiB/s rd, 0 B/s wr, 44 op/s
Nov 29 06:31:21 compute-2 sshd-session[139468]: Received disconnect from 70.184.117.114 port 46528:11: disconnected by user [preauth]
Nov 29 06:31:21 compute-2 sshd-session[139468]: Disconnected from invalid user ftpuser 70.184.117.114 port 46528 [preauth]
Nov 29 06:31:22 compute-2 sudo[139712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyleqpqiwtzuobeasncvapdbetxktnyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397881.8817556-840-138743382282103/AnsiballZ_stat.py'
Nov 29 06:31:22 compute-2 sudo[139712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:22 compute-2 python3.9[139714]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:22 compute-2 sudo[139712]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:22 compute-2 sudo[139791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwacdpaayieaaeffzedpslpwktfceyuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397881.8817556-840-138743382282103/AnsiballZ_file.py'
Nov 29 06:31:22 compute-2 sudo[139791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:22 compute-2 sshd-session[139614]: Invalid user test1 from 70.184.117.114 port 46530
Nov 29 06:31:22 compute-2 python3.9[139793]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:22 compute-2 sudo[139791]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:23.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:23 compute-2 sshd-session[139614]: error: maximum authentication attempts exceeded for invalid user test1 from 70.184.117.114 port 46530 ssh2 [preauth]
Nov 29 06:31:23 compute-2 sshd-session[139614]: Disconnecting invalid user test1 70.184.117.114 port 46530: Too many authentication failures [preauth]
Nov 29 06:31:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:31:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:23.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:31:23 compute-2 ceph-mon[77142]: pgmap v568: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 19 KiB/s rd, 0 B/s wr, 32 op/s
Nov 29 06:31:23 compute-2 sudo[139945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftkippsvcjufteavfuybmnjfkxrzbfdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397883.3740842-876-95554657185922/AnsiballZ_stat.py'
Nov 29 06:31:23 compute-2 sudo[139945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:23 compute-2 python3.9[139947]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:23 compute-2 sudo[139945]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:23 compute-2 sudo[139949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:23 compute-2 sudo[139949]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:23 compute-2 sudo[139949]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:23 compute-2 sshd-session[139818]: Invalid user test1 from 70.184.117.114 port 46542
Nov 29 06:31:23 compute-2 sudo[139975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:23 compute-2 sudo[139975]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:23 compute-2 sudo[139975]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:24 compute-2 sudo[140073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jajrfxokwbbnndaiwtdlrlwydbcpjums ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397883.3740842-876-95554657185922/AnsiballZ_file.py'
Nov 29 06:31:24 compute-2 sudo[140073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:24 compute-2 python3.9[140075]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:24 compute-2 sudo[140073]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:24 compute-2 sshd-session[139818]: error: maximum authentication attempts exceeded for invalid user test1 from 70.184.117.114 port 46542 ssh2 [preauth]
Nov 29 06:31:24 compute-2 sshd-session[139818]: Disconnecting invalid user test1 70.184.117.114 port 46542: Too many authentication failures [preauth]
Nov 29 06:31:24 compute-2 sudo[140228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvuxrugcofybrvymrosvqjkmobxbbwjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397884.7148798-913-252577623911160/AnsiballZ_systemd.py'
Nov 29 06:31:24 compute-2 sudo[140228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:31:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:25.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:31:25 compute-2 python3.9[140230]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:31:25 compute-2 systemd[1]: Reloading.
Nov 29 06:31:25 compute-2 systemd-rc-local-generator[140256]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:31:25 compute-2 systemd-sysv-generator[140259]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:31:25 compute-2 sshd-session[140101]: Invalid user test1 from 70.184.117.114 port 46556
Nov 29 06:31:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:25.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:25 compute-2 systemd[1]: Starting Create netns directory...
Nov 29 06:31:25 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:31:25 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:31:25 compute-2 systemd[1]: Finished Create netns directory.
Nov 29 06:31:25 compute-2 sudo[140228]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:25 compute-2 sshd-session[140101]: Received disconnect from 70.184.117.114 port 46556:11: disconnected by user [preauth]
Nov 29 06:31:25 compute-2 sshd-session[140101]: Disconnected from invalid user test1 70.184.117.114 port 46556 [preauth]
Nov 29 06:31:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:26 compute-2 ceph-mon[77142]: pgmap v569: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 8.2 KiB/s rd, 0 B/s wr, 13 op/s
Nov 29 06:31:26 compute-2 sshd-session[140298]: Invalid user test2 from 70.184.117.114 port 46568
Nov 29 06:31:27 compute-2 sudo[140426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xutxajtjnbopdumudvkbqwknxbmqdoqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397886.7184422-942-56704479840916/AnsiballZ_file.py'
Nov 29 06:31:27 compute-2 sudo[140426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:27.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:27 compute-2 ceph-mon[77142]: pgmap v570: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:27 compute-2 python3.9[140428]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:27 compute-2 sudo[140426]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:27 compute-2 sshd-session[140298]: error: maximum authentication attempts exceeded for invalid user test2 from 70.184.117.114 port 46568 ssh2 [preauth]
Nov 29 06:31:27 compute-2 sshd-session[140298]: Disconnecting invalid user test2 70.184.117.114 port 46568: Too many authentication failures [preauth]
Nov 29 06:31:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:27.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:27 compute-2 sudo[140580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uefqtgsndycfnavikbzcyfrebkublkgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397887.4240518-967-277431065694983/AnsiballZ_stat.py'
Nov 29 06:31:27 compute-2 sudo[140580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:27 compute-2 python3.9[140582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:27 compute-2 sudo[140580]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:28 compute-2 sudo[140703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbzvxemsctmqzccxylypozkjmakmlihq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397887.4240518-967-277431065694983/AnsiballZ_copy.py'
Nov 29 06:31:28 compute-2 sudo[140703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:28 compute-2 sshd-session[140477]: Invalid user test2 from 70.184.117.114 port 41888
Nov 29 06:31:28 compute-2 python3.9[140705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397887.4240518-967-277431065694983/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:28 compute-2 sudo[140703]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:28 compute-2 sshd-session[140477]: error: maximum authentication attempts exceeded for invalid user test2 from 70.184.117.114 port 41888 ssh2 [preauth]
Nov 29 06:31:28 compute-2 sshd-session[140477]: Disconnecting invalid user test2 70.184.117.114 port 41888: Too many authentication failures [preauth]
Nov 29 06:31:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:29.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:29 compute-2 sudo[140858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmljfmwalhsmpouwpjynmlumblpzmuwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397889.2154577-1018-280051497825342/AnsiballZ_file.py'
Nov 29 06:31:29 compute-2 sudo[140858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:29.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:29 compute-2 sshd-session[140731]: Invalid user test2 from 70.184.117.114 port 41896
Nov 29 06:31:29 compute-2 python3.9[140860]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:29 compute-2 sudo[140858]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:29 compute-2 sshd-session[140731]: Received disconnect from 70.184.117.114 port 41896:11: disconnected by user [preauth]
Nov 29 06:31:29 compute-2 sshd-session[140731]: Disconnected from invalid user test2 70.184.117.114 port 41896 [preauth]
Nov 29 06:31:29 compute-2 ceph-mon[77142]: pgmap v571: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:30 compute-2 sshd-session[140885]: Invalid user ubuntu from 70.184.117.114 port 41900
Nov 29 06:31:30 compute-2 sudo[141012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lewskehcsonpvbxhnlxmlqenikmbmuwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397890.0096362-1041-170456456243359/AnsiballZ_stat.py'
Nov 29 06:31:30 compute-2 sudo[141012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:30 compute-2 python3.9[141014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:30 compute-2 sudo[141012]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:30 compute-2 sshd-session[140885]: error: maximum authentication attempts exceeded for invalid user ubuntu from 70.184.117.114 port 41900 ssh2 [preauth]
Nov 29 06:31:30 compute-2 sshd-session[140885]: Disconnecting invalid user ubuntu 70.184.117.114 port 41900: Too many authentication failures [preauth]
Nov 29 06:31:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:30 compute-2 sudo[141138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vedipvpubtgcmebqmzbduwufbytmrvvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397890.0096362-1041-170456456243359/AnsiballZ_copy.py'
Nov 29 06:31:30 compute-2 sudo[141138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:31.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:31 compute-2 python3.9[141140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397890.0096362-1041-170456456243359/.source.json _original_basename=.3p2y0425 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:31 compute-2 sudo[141138]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.203539) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891203582, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1375, "num_deletes": 252, "total_data_size": 3197559, "memory_usage": 3240736, "flush_reason": "Manual Compaction"}
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891215348, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 2087601, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9261, "largest_seqno": 10631, "table_properties": {"data_size": 2081808, "index_size": 3124, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12444, "raw_average_key_size": 19, "raw_value_size": 2069865, "raw_average_value_size": 3254, "num_data_blocks": 144, "num_entries": 636, "num_filter_entries": 636, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397736, "oldest_key_time": 1764397736, "file_creation_time": 1764397891, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 11845 microseconds, and 4645 cpu microseconds.
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.215386) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 2087601 bytes OK
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.215404) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.217213) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.217230) EVENT_LOG_v1 {"time_micros": 1764397891217225, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.217252) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 3191162, prev total WAL file size 3191162, number of live WAL files 2.
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.217928) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(2038KB)], [18(9227KB)]
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891217958, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 11536634, "oldest_snapshot_seqno": -1}
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4001 keys, 9547217 bytes, temperature: kUnknown
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891271995, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 9547217, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9515741, "index_size": 20358, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10053, "raw_key_size": 98601, "raw_average_key_size": 24, "raw_value_size": 9438548, "raw_average_value_size": 2359, "num_data_blocks": 889, "num_entries": 4001, "num_filter_entries": 4001, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764397891, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.272293) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 9547217 bytes
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.273361) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 213.2 rd, 176.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 9.0 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(10.1) write-amplify(4.6) OK, records in: 4521, records dropped: 520 output_compression: NoCompression
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.273382) EVENT_LOG_v1 {"time_micros": 1764397891273371, "job": 8, "event": "compaction_finished", "compaction_time_micros": 54121, "compaction_time_cpu_micros": 20949, "output_level": 6, "num_output_files": 1, "total_output_size": 9547217, "num_input_records": 4521, "num_output_records": 4001, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891273856, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891275503, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.217885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.275648) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.275655) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.275656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.275657) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.275659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:31.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:31 compute-2 sshd-session[141110]: Invalid user ubuntu from 70.184.117.114 port 41902
Nov 29 06:31:31 compute-2 ceph-mon[77142]: pgmap v572: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:32 compute-2 sudo[141290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdxdndhgekzafprbfdxtkjztnqqfdigl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397891.9059446-1087-105493104902433/AnsiballZ_file.py'
Nov 29 06:31:32 compute-2 sudo[141290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:32 compute-2 python3.9[141292]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:32 compute-2 sshd-session[141110]: error: maximum authentication attempts exceeded for invalid user ubuntu from 70.184.117.114 port 41902 ssh2 [preauth]
Nov 29 06:31:32 compute-2 sshd-session[141110]: Disconnecting invalid user ubuntu 70.184.117.114 port 41902: Too many authentication failures [preauth]
Nov 29 06:31:32 compute-2 sudo[141290]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:33.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:33 compute-2 sudo[141445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pohyegypmwmtioodnkktxiuostxkdwtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397893.0646443-1110-172581771593851/AnsiballZ_stat.py'
Nov 29 06:31:33 compute-2 sudo[141445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:33 compute-2 sshd-session[141318]: Invalid user ubuntu from 70.184.117.114 port 41904
Nov 29 06:31:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:33.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:33 compute-2 sudo[141445]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:33 compute-2 sshd-session[141318]: Received disconnect from 70.184.117.114 port 41904:11: disconnected by user [preauth]
Nov 29 06:31:33 compute-2 sshd-session[141318]: Disconnected from invalid user ubuntu 70.184.117.114 port 41904 [preauth]
Nov 29 06:31:33 compute-2 sudo[141568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqmqcdookazexqahfnzhvgsjaeauvxhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397893.0646443-1110-172581771593851/AnsiballZ_copy.py'
Nov 29 06:31:33 compute-2 sudo[141568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:34 compute-2 sudo[141568]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:34 compute-2 sshd-session[141570]: Invalid user pi from 70.184.117.114 port 41920
Nov 29 06:31:34 compute-2 sshd-session[141570]: Received disconnect from 70.184.117.114 port 41920:11: disconnected by user [preauth]
Nov 29 06:31:34 compute-2 sshd-session[141570]: Disconnected from invalid user pi 70.184.117.114 port 41920 [preauth]
Nov 29 06:31:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:35.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:35 compute-2 sudo[141725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftzvnjanymsbzaihbkhghsnqikxeponk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397894.6345053-1161-211595204115445/AnsiballZ_container_config_data.py'
Nov 29 06:31:35 compute-2 sudo[141725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:35 compute-2 ceph-mon[77142]: pgmap v573: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:35 compute-2 python3.9[141727]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 29 06:31:35 compute-2 sudo[141725]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:35.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:35 compute-2 sshd-session[141670]: Invalid user baikal from 70.184.117.114 port 41932
Nov 29 06:31:35 compute-2 sshd-session[141670]: Received disconnect from 70.184.117.114 port 41932:11: disconnected by user [preauth]
Nov 29 06:31:35 compute-2 sshd-session[141670]: Disconnected from invalid user baikal 70.184.117.114 port 41932 [preauth]
Nov 29 06:31:36 compute-2 sudo[141877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkzeshzlphwhnffgdwbytxbiyaylbeap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397895.6586783-1188-20228105751305/AnsiballZ_container_config_hash.py'
Nov 29 06:31:36 compute-2 sudo[141877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:36 compute-2 python3.9[141879]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:31:36 compute-2 sudo[141877]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:37.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:37.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:37 compute-2 ceph-mon[77142]: pgmap v574: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:37 compute-2 ceph-mon[77142]: pgmap v575: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:37 compute-2 sudo[142030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ineeyucsrersfrgeiceoqkablrjubpae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397897.5187044-1215-156632209212860/AnsiballZ_podman_container_info.py'
Nov 29 06:31:37 compute-2 sudo[142030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:38 compute-2 python3.9[142032]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 06:31:38 compute-2 sudo[142030]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:39.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:39.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:39 compute-2 sudo[142222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwkozojuvxgspcrfhltkxjtlpbaeyiej ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397899.3284066-1254-146255385682680/AnsiballZ_edpm_container_manage.py'
Nov 29 06:31:39 compute-2 sudo[142222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:39 compute-2 podman[142184]: 2025-11-29 06:31:39.94158813 +0000 UTC m=+0.103599538 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:31:40 compute-2 python3[142228]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:31:40 compute-2 ceph-mon[77142]: pgmap v576: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:31:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:41.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:31:41 compute-2 ceph-mon[77142]: pgmap v577: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:31:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:41.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:31:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:31:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:43.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:31:44 compute-2 ceph-mon[77142]: pgmap v578: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:45.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:45.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:45 compute-2 sudo[142321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:45 compute-2 sudo[142321]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:45 compute-2 sudo[142321]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:45 compute-2 sudo[142346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:45 compute-2 sudo[142346]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:45 compute-2 sudo[142346]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:45 compute-2 sudo[142371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:45 compute-2 sudo[142371]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:45 compute-2 sudo[142371]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:45 compute-2 sudo[142396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:31:45 compute-2 sudo[142396]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:45 compute-2 sudo[142396]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:31:46 compute-2 ceph-mon[77142]: pgmap v579: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:47.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:47.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:49.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:49.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:51.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:31:51 compute-2 ceph-mon[77142]: pgmap v580: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:51 compute-2 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:31:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:51.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:51 compute-2 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:31:52 compute-2 ceph-mon[77142]: pgmap v581: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:52 compute-2 ceph-mon[77142]: pgmap v582: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:53.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:53 compute-2 ceph-mon[77142]: pgmap v583: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:53 compute-2 podman[142252]: 2025-11-29 06:31:53.466558882 +0000 UTC m=+13.216320837 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:31:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:53.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:53 compute-2 podman[142479]: 2025-11-29 06:31:53.630403229 +0000 UTC m=+0.052197480 container create b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:31:53 compute-2 podman[142479]: 2025-11-29 06:31:53.599014432 +0000 UTC m=+0.020808713 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:31:53 compute-2 python3[142228]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:31:53 compute-2 sudo[142222]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:55.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:55.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:57.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:57 compute-2 ceph-mon[77142]: pgmap v584: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:57.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:59.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:31:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:59.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:00 compute-2 ceph-mon[77142]: pgmap v585: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:01.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:01.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:03 compute-2 ceph-mon[77142]: pgmap v586: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:03 compute-2 ceph-mon[77142]: pgmap v587: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:32:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:03.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:32:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:03.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:32:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:05.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:32:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:32:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:05.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:32:05 compute-2 ceph-mon[77142]: pgmap v588: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:05 compute-2 sudo[142548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:32:05 compute-2 sudo[142548]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:05 compute-2 sudo[142548]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:05 compute-2 sudo[142573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:32:05 compute-2 sudo[142573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:05 compute-2 sudo[142573]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:07.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:07 compute-2 ceph-mon[77142]: pgmap v589: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:32:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:07.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:32:08 compute-2 sudo[142725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnkmwmwuhvzlwjwysgbnjahahtsczbre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397928.401105-1278-68325585642461/AnsiballZ_stat.py'
Nov 29 06:32:08 compute-2 sudo[142725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:08 compute-2 ceph-mon[77142]: pgmap v590: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:08 compute-2 python3.9[142727]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:32:08 compute-2 sudo[142725]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:09.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:09 compute-2 sudo[142879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gypqssompahgihlaagcqyhuxchzhjxni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397929.1903458-1305-271956886420074/AnsiballZ_file.py'
Nov 29 06:32:09 compute-2 sudo[142879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:32:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:09.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:32:09 compute-2 python3.9[142881]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:09 compute-2 sudo[142879]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:09 compute-2 sudo[142955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eawbshcxjjczydzwcbsqxgtggbrkbpim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397929.1903458-1305-271956886420074/AnsiballZ_stat.py'
Nov 29 06:32:09 compute-2 sudo[142955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:10 compute-2 podman[142957]: 2025-11-29 06:32:10.115063383 +0000 UTC m=+0.124357398 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:32:10 compute-2 python3.9[142958]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:32:10 compute-2 sudo[142955]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:10 compute-2 ceph-mon[77142]: pgmap v591: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:10 compute-2 sudo[143133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ornfogxvzimxoyxrtozbuaxoemixvafo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397930.265606-1305-245620988595391/AnsiballZ_copy.py'
Nov 29 06:32:10 compute-2 sudo[143133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:10 compute-2 python3.9[143135]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397930.265606-1305-245620988595391/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:11 compute-2 sudo[143133]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:11.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:11 compute-2 sudo[143209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbdrrnkxxuogqzrymgjhdybhezpcnfad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397930.265606-1305-245620988595391/AnsiballZ_systemd.py'
Nov 29 06:32:11 compute-2 sudo[143209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:11 compute-2 python3.9[143211]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:32:11 compute-2 systemd[1]: Reloading.
Nov 29 06:32:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:11.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:11 compute-2 systemd-sysv-generator[143241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:11 compute-2 systemd-rc-local-generator[143238]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:11 compute-2 sudo[143209]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:12 compute-2 sudo[143321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfnbqsjqfabmvgkxjvvftthvkkxnqlpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397930.265606-1305-245620988595391/AnsiballZ_systemd.py'
Nov 29 06:32:12 compute-2 sudo[143321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:12 compute-2 python3.9[143323]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:12 compute-2 systemd[1]: Reloading.
Nov 29 06:32:12 compute-2 systemd-rc-local-generator[143351]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:12 compute-2 systemd-sysv-generator[143354]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:12 compute-2 systemd[1]: Starting ovn_metadata_agent container...
Nov 29 06:32:12 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:32:12 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e600fbe3e5682b63fdf6b9076df48890b4fc9515c63768d1ddebb9c35046ec3/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 29 06:32:12 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e600fbe3e5682b63fdf6b9076df48890b4fc9515c63768d1ddebb9c35046ec3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:32:12 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108.
Nov 29 06:32:12 compute-2 podman[143365]: 2025-11-29 06:32:12.911425851 +0000 UTC m=+0.131028946 container init b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 29 06:32:12 compute-2 ovn_metadata_agent[143380]: + sudo -E kolla_set_configs
Nov 29 06:32:12 compute-2 podman[143365]: 2025-11-29 06:32:12.957835664 +0000 UTC m=+0.177438759 container start b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:32:12 compute-2 edpm-start-podman-container[143365]: ovn_metadata_agent
Nov 29 06:32:12 compute-2 ceph-mon[77142]: pgmap v592: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:13 compute-2 edpm-start-podman-container[143364]: Creating additional drop-in dependency for "ovn_metadata_agent" (b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108)
Nov 29 06:32:13 compute-2 podman[143386]: 2025-11-29 06:32:13.033413998 +0000 UTC m=+0.061694537 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:32:13 compute-2 systemd[1]: Reloading.
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Validating config file
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Copying service configuration files
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Writing out command to execute
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: ++ cat /run_command
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: + CMD=neutron-ovn-metadata-agent
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: + ARGS=
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: + sudo kolla_copy_cacerts
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: + [[ ! -n '' ]]
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: + . kolla_extend_start
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: Running command: 'neutron-ovn-metadata-agent'
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: + umask 0022
Nov 29 06:32:13 compute-2 ovn_metadata_agent[143380]: + exec neutron-ovn-metadata-agent
Nov 29 06:32:13 compute-2 systemd-rc-local-generator[143454]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:13 compute-2 systemd-sysv-generator[143458]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:32:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:13.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:32:13 compute-2 systemd[1]: Started ovn_metadata_agent container.
Nov 29 06:32:13 compute-2 sudo[143321]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:32:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:13.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:32:14 compute-2 ceph-mon[77142]: pgmap v593: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.085 143385 INFO neutron.common.config [-] Logging enabled!
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.086 143385 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.086 143385 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.120 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.120 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.120 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.120 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.129 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.129 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.129 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.129 143385 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.129 143385 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.143 143385 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name fa6f2e5a-176a-4b37-8b2a-5aaf74119c47 (UUID: fa6f2e5a-176a-4b37-8b2a-5aaf74119c47) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 29 06:32:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:15.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.170 143385 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.170 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.170 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.170 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.174 143385 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.202 143385 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.300 143385 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'fa6f2e5a-176a-4b37-8b2a-5aaf74119c47'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f7a2ef86760>], external_ids={}, name=fa6f2e5a-176a-4b37-8b2a-5aaf74119c47, nb_cfg_timestamp=1764397847534, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.302 143385 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f7a2ef75f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.303 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.303 143385 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.304 143385 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.304 143385 INFO oslo_service.service [-] Starting 1 workers
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.311 143385 DEBUG oslo_service.service [-] Started child 143492 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.315 143492 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-2000351'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.318 143385 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpfwloyrd0/privsep.sock']
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.361 143492 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.361 143492 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.361 143492 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.366 143492 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.371 143492 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 06:32:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.377 143492 INFO eventlet.wsgi.server [-] (143492) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 29 06:32:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:15.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:15 compute-2 ceph-mon[77142]: pgmap v594: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:15 compute-2 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 29 06:32:16 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.057 143385 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 29 06:32:16 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.058 143385 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpfwloyrd0/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 29 06:32:16 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.892 143497 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 29 06:32:16 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.898 143497 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 29 06:32:16 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.906 143497 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 29 06:32:16 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.907 143497 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143497
Nov 29 06:32:16 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.061 143497 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6302d5-250e-431f-83c0-e4e736a6dceb]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:32:16 compute-2 sshd-session[134061]: Connection closed by 192.168.122.30 port 52788
Nov 29 06:32:16 compute-2 sshd-session[134056]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:32:16 compute-2 systemd[1]: session-47.scope: Deactivated successfully.
Nov 29 06:32:16 compute-2 systemd[1]: session-47.scope: Consumed 58.749s CPU time.
Nov 29 06:32:16 compute-2 systemd-logind[784]: Session 47 logged out. Waiting for processes to exit.
Nov 29 06:32:16 compute-2 systemd-logind[784]: Removed session 47.
Nov 29 06:32:16 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.626 143497 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:32:16 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.626 143497 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:32:16 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.626 143497 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:32:16 compute-2 ceph-mon[77142]: pgmap v595: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:17.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.190 143497 DEBUG oslo.privsep.daemon [-] privsep: reply[c0429e42-80b0-4f46-a43b-31b344e33fec]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.194 143385 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=fa6f2e5a-176a-4b37-8b2a-5aaf74119c47, column=external_ids, values=({'neutron:ovn-metadata-id': '0fe2b91c-e971-5566-838d-5b85755822a8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.203 143385 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fa6f2e5a-176a-4b37-8b2a-5aaf74119c47, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.209 143385 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:32:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:17.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:19.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:19.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.004000100s ======
Nov 29 06:32:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:21.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000100s
Nov 29 06:32:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:21.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:22 compute-2 sshd-session[143527]: Accepted publickey for zuul from 192.168.122.30 port 50168 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:32:22 compute-2 systemd-logind[784]: New session 48 of user zuul.
Nov 29 06:32:22 compute-2 systemd[1]: Started Session 48 of User zuul.
Nov 29 06:32:22 compute-2 sshd-session[143527]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:32:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:23.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:23 compute-2 python3.9[143681]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:32:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:23.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:24 compute-2 sudo[143836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tircadhiszduhyktyxaeqvsmnrsuzonj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397944.0722735-70-251367610104428/AnsiballZ_command.py'
Nov 29 06:32:24 compute-2 sudo[143836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:24 compute-2 python3.9[143838]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:32:24 compute-2 sudo[143836]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:25.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:25.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:26 compute-2 sudo[143876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:32:26 compute-2 sudo[143876]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:26 compute-2 sudo[143876]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:26 compute-2 sudo[143901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:32:26 compute-2 sudo[143901]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:26 compute-2 sudo[143901]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:27.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:27.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:27 compute-2 sudo[144052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trjxwdxxtmqmxmykoukbxlrcjzqudokw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397947.043583-102-40435856804354/AnsiballZ_systemd_service.py'
Nov 29 06:32:27 compute-2 sudo[144052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:28 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:32:29 compute-2 python3.9[144054]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:32:29 compute-2 systemd[1]: Reloading.
Nov 29 06:32:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:29.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:29 compute-2 systemd-sysv-generator[144086]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:29 compute-2 systemd-rc-local-generator[144082]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:29 compute-2 sudo[144052]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:29.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:29 compute-2 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 06:32:29 compute-2 ceph-mon[77142]: paxos.1).electionLogic(23) init, last seen epoch 23, mid-election, bumping
Nov 29 06:32:29 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:32:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:31.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:31.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:32 compute-2 python3.9[144241]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:32:32 compute-2 network[144258]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:32:32 compute-2 network[144259]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:32:32 compute-2 network[144260]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:32:32 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:32:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:33.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:33.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:34 compute-2 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 is new leader, mons compute-2,compute-1 in quorum (ranks 1,2)
Nov 29 06:32:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:35.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:35 compute-2 ceph-mon[77142]: log_channel(cluster) log [INF] : overall HEALTH_OK
Nov 29 06:32:35 compute-2 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 06:32:35 compute-2 ceph-mon[77142]: paxos.1).electionLogic(26) init, last seen epoch 26
Nov 29 06:32:35 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:32:35 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_timecheck drop unexpected msg
Nov 29 06:32:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:35.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:35 compute-2 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:32:36 compute-2 ceph-mon[77142]: pgmap v596: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:36 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:32:36 compute-2 sudo[144523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmvuhrowsdoveksmmummwjyumkhsnyrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397956.4881608-160-21940969835526/AnsiballZ_systemd_service.py'
Nov 29 06:32:36 compute-2 sudo[144523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:37 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:32:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:37.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:37 compute-2 python3.9[144525]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:37 compute-2 sudo[144523]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:37.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:37 compute-2 sudo[144676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmakktkdanwpjksdfmfuoomskcuxhhrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397957.3845315-160-250657242126607/AnsiballZ_systemd_service.py'
Nov 29 06:32:37 compute-2 sudo[144676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:37 compute-2 python3.9[144678]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:38 compute-2 sudo[144676]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:38 compute-2 sudo[144830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvxuxluyvmxaixcuadqvayxqbcbmuiqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397958.1573315-160-138396289364266/AnsiballZ_systemd_service.py'
Nov 29 06:32:38 compute-2 sudo[144830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:38 compute-2 ceph-mon[77142]: pgmap v599: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-2 ceph-mon[77142]: pgmap v600: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-2 ceph-mon[77142]: pgmap v601: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-2 ceph-mon[77142]: mon.compute-1 calling monitor election
Nov 29 06:32:38 compute-2 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 06:32:38 compute-2 ceph-mon[77142]: pgmap v602: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-2 ceph-mon[77142]: pgmap v603: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-2 ceph-mon[77142]: pgmap v604: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-2 ceph-mon[77142]: mon.compute-2 is new leader, mons compute-2,compute-1 in quorum (ranks 1,2)
Nov 29 06:32:38 compute-2 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 06:32:38 compute-2 ceph-mon[77142]: overall HEALTH_OK
Nov 29 06:32:38 compute-2 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 06:32:38 compute-2 ceph-mon[77142]: mon.compute-1 calling monitor election
Nov 29 06:32:38 compute-2 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 06:32:38 compute-2 ceph-mon[77142]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 06:32:38 compute-2 ceph-mon[77142]: pgmap v605: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-2 ceph-mon[77142]: monmap e3: 3 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-1=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Nov 29 06:32:38 compute-2 ceph-mon[77142]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:32:38 compute-2 ceph-mon[77142]: osdmap e139: 3 total, 3 up, 3 in
Nov 29 06:32:38 compute-2 ceph-mon[77142]: mgrmap e10: compute-0.vxabpq(active, since 15m), standbys: compute-2.ngsyhe, compute-1.gaxpay
Nov 29 06:32:38 compute-2 ceph-mon[77142]: overall HEALTH_OK
Nov 29 06:32:38 compute-2 python3.9[144832]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:38 compute-2 sudo[144830]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:39.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:39 compute-2 sudo[144983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqfjsxmqtqiilmyidfdcbruujgjqvkrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397958.9507902-160-51813465871419/AnsiballZ_systemd_service.py'
Nov 29 06:32:39 compute-2 sudo[144983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:39 compute-2 python3.9[144985]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:39.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:39 compute-2 sudo[144983]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:40 compute-2 sudo[145136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irhbqwrciefjbgleugboapheylwaczyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397959.7884896-160-159305466917126/AnsiballZ_systemd_service.py'
Nov 29 06:32:40 compute-2 sudo[145136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:40 compute-2 python3.9[145138]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:40 compute-2 sudo[145136]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:40 compute-2 podman[145140]: 2025-11-29 06:32:40.476794525 +0000 UTC m=+0.109759728 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:32:40 compute-2 sudo[145316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adkazxyauyshtnjmhipoclgasirtcuui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397960.5121646-160-253507601422908/AnsiballZ_systemd_service.py'
Nov 29 06:32:40 compute-2 sudo[145316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:41 compute-2 python3.9[145318]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:41 compute-2 sudo[145316]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:41.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:41 compute-2 ceph-mon[77142]: pgmap v606: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:41 compute-2 sudo[145469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqyaueeyokwcvhvxighxynsgurifpmtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397961.2670703-160-46882808592018/AnsiballZ_systemd_service.py'
Nov 29 06:32:41 compute-2 sudo[145469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:41.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:41 compute-2 python3.9[145471]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:41 compute-2 sudo[145469]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:43.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:43.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:43 compute-2 ceph-mon[77142]: pgmap v607: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:43 compute-2 podman[145498]: 2025-11-29 06:32:43.921915938 +0000 UTC m=+0.069617330 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 06:32:44 compute-2 sudo[145642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyfthtyzowbgsagmropufqtdohogyqxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397963.8564517-316-239007908664752/AnsiballZ_file.py'
Nov 29 06:32:44 compute-2 sudo[145642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:44 compute-2 python3.9[145644]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:44 compute-2 sudo[145642]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:44 compute-2 sudo[145795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvuagrjdpgcfavykknujpwzasohejxpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397964.6460712-316-1909597194575/AnsiballZ_file.py'
Nov 29 06:32:44 compute-2 sudo[145795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:45 compute-2 python3.9[145797]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:45 compute-2 sudo[145795]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:45.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:45 compute-2 sudo[145947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvtiackbganhcufvsicbcgtjjjjdsxay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397965.2764242-316-275157071565344/AnsiballZ_file.py'
Nov 29 06:32:45 compute-2 sudo[145947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:45.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:45 compute-2 python3.9[145949]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:45 compute-2 sudo[145947]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-2 sudo[146049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:32:46 compute-2 sudo[146049]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:46 compute-2 sudo[146049]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-2 ceph-mon[77142]: pgmap v608: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:46 compute-2 sudo[146080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:32:46 compute-2 sudo[146080]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:46 compute-2 sudo[146080]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-2 sudo[146172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qujdrqqkztguduqrbgvivlslnbamynvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397965.8742716-316-253752732355150/AnsiballZ_file.py'
Nov 29 06:32:46 compute-2 sudo[146129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:32:46 compute-2 sudo[146172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:46 compute-2 sudo[146129]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:46 compute-2 sudo[146129]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-2 sudo[146177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:32:46 compute-2 sudo[146177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:46 compute-2 sudo[146180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:32:46 compute-2 sudo[146177]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-2 sudo[146180]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:46 compute-2 sudo[146227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:32:46 compute-2 sudo[146227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:46 compute-2 sudo[146227]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-2 python3.9[146175]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:46 compute-2 sudo[146180]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-2 sudo[146172]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:47 compute-2 sudo[146433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlyedipcaqcssfhvtysdaqncmwqjjiko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397966.8185933-316-198917686229667/AnsiballZ_file.py'
Nov 29 06:32:47 compute-2 sudo[146433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:47.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:47 compute-2 python3.9[146435]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:47 compute-2 sudo[146433]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:47.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:47 compute-2 sudo[146585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnnarktzgfskxumozvirlorzghtpxbtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397967.4754202-316-263436146565670/AnsiballZ_file.py'
Nov 29 06:32:47 compute-2 sudo[146585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:47 compute-2 python3.9[146587]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:47 compute-2 sudo[146585]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:48 compute-2 sudo[146737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brmjzuqjzoiemwrlmhwzqboxxntwkahl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397968.0804956-316-123402636031650/AnsiballZ_file.py'
Nov 29 06:32:48 compute-2 sudo[146737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:48 compute-2 python3.9[146739]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:48 compute-2 sudo[146737]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:49.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:49.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:51.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:51.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:52 compute-2 sudo[146891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlpoytpdekztdsjhdwezmtmstdevndon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397972.0261579-466-165084689460724/AnsiballZ_file.py'
Nov 29 06:32:52 compute-2 sudo[146891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:52 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:32:52 compute-2 python3.9[146893]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:52 compute-2 sudo[146891]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:53 compute-2 sudo[147044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klgisfcvewayoreuhnxevztchaoagigx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397972.7581189-466-233142167946237/AnsiballZ_file.py'
Nov 29 06:32:53 compute-2 sudo[147044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:53 compute-2 python3.9[147046]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:53.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:53 compute-2 sudo[147044]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:53.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:53 compute-2 sudo[147196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdershyhoxokymmacbymfajtnrdudgdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397973.3525567-466-35755846714671/AnsiballZ_file.py'
Nov 29 06:32:53 compute-2 sudo[147196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:53 compute-2 python3.9[147198]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:53 compute-2 sudo[147196]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:54 compute-2 sudo[147348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlfyusmnziqoygrhfidpnyzuzwmtuaoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397973.9352093-466-174616213147571/AnsiballZ_file.py'
Nov 29 06:32:54 compute-2 sudo[147348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:54 compute-2 python3.9[147350]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:54 compute-2 sudo[147348]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:54 compute-2 sudo[147501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnzzafqvblehwatbvrazmhiiiccytxbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397974.548409-466-1149363406760/AnsiballZ_file.py'
Nov 29 06:32:54 compute-2 sudo[147501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:55 compute-2 python3.9[147503]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:55 compute-2 sudo[147501]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:55.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:55.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:55 compute-2 sudo[147653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxttjldzyxugaodhjougrhzsgycfrvuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397975.2171886-466-111328535119807/AnsiballZ_file.py'
Nov 29 06:32:55 compute-2 sudo[147653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:55 compute-2 python3.9[147655]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:55 compute-2 sudo[147653]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:55 compute-2 ceph-mon[77142]: pgmap v609: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:56 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:56 compute-2 sudo[147805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apfgclrlxdftpefsotdeiosabepobohl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397976.0047581-466-221894507600880/AnsiballZ_file.py'
Nov 29 06:32:56 compute-2 sudo[147805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:56 compute-2 python3.9[147807]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:56 compute-2 sudo[147805]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:57.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:57.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:57 compute-2 ceph-mon[77142]: pgmap v610: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:57 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:32:57 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:32:57 compute-2 ceph-mon[77142]: pgmap v611: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:57 compute-2 ceph-mon[77142]: pgmap v612: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:57 compute-2 ceph-mon[77142]: pgmap v613: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:57 compute-2 ceph-mon[77142]: pgmap v614: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:57 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:32:57 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:32:57 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:32:57 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:32:57 compute-2 ceph-mon[77142]: pgmap v615: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:58 compute-2 sudo[147959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsdrvzyhajmihdewrvwkpzwgvnbcpxsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397978.2033517-619-251145524647593/AnsiballZ_command.py'
Nov 29 06:32:58 compute-2 sudo[147959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:58 compute-2 python3.9[147961]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:32:58 compute-2 sudo[147959]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:59.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:32:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:59.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:59 compute-2 python3.9[148113]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:33:01 compute-2 sudo[148266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sujzvrrzqkyorksweefywsexczrhkqbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397980.7176373-672-70163992945951/AnsiballZ_systemd_service.py'
Nov 29 06:33:01 compute-2 sudo[148266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:01 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:01.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:01 compute-2 python3.9[148268]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:33:01 compute-2 systemd[1]: Reloading.
Nov 29 06:33:01 compute-2 systemd-rc-local-generator[148292]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:33:01 compute-2 systemd-sysv-generator[148300]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:33:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:01.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:01 compute-2 sshd-session[148214]: Invalid user debian from 92.118.39.92 port 60400
Nov 29 06:33:01 compute-2 sudo[148266]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:01 compute-2 sshd-session[148214]: Connection closed by invalid user debian 92.118.39.92 port 60400 [preauth]
Nov 29 06:33:01 compute-2 ceph-mon[77142]: pgmap v616: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:02 compute-2 sudo[148454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjreocygxjbotmgjnbwtykmtpkejzanm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397982.3800159-697-197585013720959/AnsiballZ_command.py'
Nov 29 06:33:02 compute-2 sudo[148454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:02 compute-2 python3.9[148456]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:02 compute-2 sudo[148454]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:03.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:03 compute-2 sudo[148607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhzirlfdnxsdhcaicjchigulrvktaabu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397983.020104-697-56277857498168/AnsiballZ_command.py'
Nov 29 06:33:03 compute-2 sudo[148607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:03 compute-2 python3.9[148609]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:03 compute-2 sudo[148607]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:03.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:03 compute-2 sudo[148760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enmgjcwiyxqnyixrwqrwbfbzoimzpnzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397983.6541648-697-125251215528494/AnsiballZ_command.py'
Nov 29 06:33:03 compute-2 sudo[148760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:04 compute-2 python3.9[148762]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:04 compute-2 sudo[148760]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:04 compute-2 ceph-mon[77142]: pgmap v617: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:04 compute-2 ceph-mon[77142]: pgmap v618: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:04 compute-2 sudo[148914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbeawkkovxbjjpxxzwoarlfpqoyzyjxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397984.291923-697-68104621650959/AnsiballZ_command.py'
Nov 29 06:33:04 compute-2 sudo[148914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:04 compute-2 python3.9[148916]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:04 compute-2 sudo[148914]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:05 compute-2 sudo[149067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkanfbcrqtdiojjsdnynyjwedgunctok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397984.8855052-697-160633553605918/AnsiballZ_command.py'
Nov 29 06:33:05 compute-2 sudo[149067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:05.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:05 compute-2 python3.9[149069]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:05 compute-2 sudo[149067]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:05.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:05 compute-2 sudo[149220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbtgpvjpkurgqokdizyyqypavaehpmfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397985.5326812-697-214079008562003/AnsiballZ_command.py'
Nov 29 06:33:05 compute-2 sudo[149220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:06 compute-2 python3.9[149222]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:06 compute-2 sudo[149220]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:06 compute-2 sudo[149323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:33:06 compute-2 sudo[149323]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:33:06 compute-2 sudo[149323]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:06 compute-2 sudo[149369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:33:06 compute-2 sudo[149369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:33:06 compute-2 sudo[149369]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:06 compute-2 sudo[149424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqtwnctrhcrrhhhzgzrzkradxdlhpqko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397986.2001574-697-281250530194073/AnsiballZ_command.py'
Nov 29 06:33:06 compute-2 sudo[149424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:06 compute-2 ceph-mon[77142]: pgmap v619: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:06 compute-2 python3.9[149426]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:06 compute-2 sudo[149424]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:07.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:07.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:08 compute-2 ceph-mon[77142]: pgmap v620: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:09 compute-2 sudo[149578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pddcdvakxuemoularcaccgtwsavjzvut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397988.8297486-859-7714404520030/AnsiballZ_getent.py'
Nov 29 06:33:09 compute-2 sudo[149578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:09.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:09 compute-2 python3.9[149580]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 29 06:33:09 compute-2 sudo[149578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:09.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:10 compute-2 ceph-mon[77142]: pgmap v621: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:10 compute-2 sudo[149731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfbfbqbjqzlviiyvkfsujhgqrbvaavez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397989.747473-882-261881629133673/AnsiballZ_group.py'
Nov 29 06:33:10 compute-2 sudo[149731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:10 compute-2 python3.9[149733]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:33:10 compute-2 groupadd[149734]: group added to /etc/group: name=libvirt, GID=42473
Nov 29 06:33:10 compute-2 groupadd[149734]: group added to /etc/gshadow: name=libvirt
Nov 29 06:33:10 compute-2 groupadd[149734]: new group: name=libvirt, GID=42473
Nov 29 06:33:10 compute-2 sudo[149731]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:10 compute-2 podman[149765]: 2025-11-29 06:33:10.941697039 +0000 UTC m=+0.095433833 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 06:33:11 compute-2 sudo[149792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:33:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:11 compute-2 sudo[149792]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:33:11 compute-2 sudo[149792]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:11 compute-2 sudo[149840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:33:11 compute-2 sudo[149840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:33:11 compute-2 sudo[149840]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:11.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:11 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:33:11 compute-2 ceph-mon[77142]: pgmap v622: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:11 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:33:11 compute-2 sudo[149967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eilxgchxsplkdacaodwhtcsadutbdxvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397991.112153-907-63084183089383/AnsiballZ_user.py'
Nov 29 06:33:11 compute-2 sudo[149967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:11.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:11 compute-2 python3.9[149969]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:33:12 compute-2 useradd[149971]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 29 06:33:12 compute-2 sudo[149967]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:13.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:13.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:13 compute-2 ceph-mon[77142]: pgmap v623: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:14 compute-2 podman[150004]: 2025-11-29 06:33:14.879670973 +0000 UTC m=+0.046759036 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 06:33:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:33:15.122 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:33:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:33:15.124 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:33:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:33:15.124 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:33:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:15.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:15.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:15 compute-2 ceph-mon[77142]: pgmap v624: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:16 compute-2 sudo[150149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xswyqbjwdfiohsfotxcfettlhltoefqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397995.7653785-940-114226683053760/AnsiballZ_setup.py'
Nov 29 06:33:16 compute-2 sudo[150149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:16 compute-2 python3.9[150151]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:33:16 compute-2 sudo[150149]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:16 compute-2 sudo[150234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anlubxgpadqeztlfinntvhdqlaohjsuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397995.7653785-940-114226683053760/AnsiballZ_dnf.py'
Nov 29 06:33:16 compute-2 sudo[150234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:17 compute-2 python3.9[150236]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:33:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:17.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:17.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:18 compute-2 ceph-mon[77142]: pgmap v625: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:19 compute-2 ceph-mon[77142]: pgmap v626: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:19.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:19.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:21.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:21 compute-2 ceph-mon[77142]: pgmap v627: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:21.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:23.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:23.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:24 compute-2 ceph-mon[77142]: pgmap v628: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:25.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:25.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:25 compute-2 ceph-mon[77142]: pgmap v629: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:26 compute-2 sudo[150399]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:33:26 compute-2 sudo[150399]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:33:26 compute-2 sudo[150399]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:26 compute-2 sudo[150427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:33:26 compute-2 sudo[150427]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:33:26 compute-2 sudo[150427]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:27.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:27 compute-2 ceph-mon[77142]: pgmap v630: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:27.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:29.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:29.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:29 compute-2 ceph-mon[77142]: pgmap v631: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:31.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:31 compute-2 ceph-mon[77142]: pgmap v632: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:31.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:33.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:33.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:34 compute-2 ceph-mon[77142]: pgmap v633: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:35.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:35 compute-2 ceph-mon[77142]: pgmap v634: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:35.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:37.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:37.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:37 compute-2 ceph-mon[77142]: pgmap v635: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:38 compute-2 ceph-mon[77142]: pgmap v636: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:39.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:39.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:41.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:41.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:41 compute-2 podman[150496]: 2025-11-29 06:33:41.962700671 +0000 UTC m=+0.115386101 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 29 06:33:42 compute-2 ceph-mon[77142]: pgmap v637: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:42 compute-2 kernel: SELinux:  Converting 2771 SID table entries...
Nov 29 06:33:42 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:33:42 compute-2 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:33:42 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:33:42 compute-2 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:33:42 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:33:42 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:33:42 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:33:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:43.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:43 compute-2 ceph-mon[77142]: pgmap v638: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:43.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:45.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:45.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:45 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 29 06:33:45 compute-2 podman[150529]: 2025-11-29 06:33:45.902052649 +0000 UTC m=+0.060203448 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:33:46 compute-2 ceph-mon[77142]: pgmap v639: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:46 compute-2 sudo[150550]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:33:46 compute-2 sudo[150550]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:33:46 compute-2 sudo[150550]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:46 compute-2 sudo[150575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:33:46 compute-2 sudo[150575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:33:46 compute-2 sudo[150575]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:47.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:47 compute-2 ceph-mon[77142]: pgmap v640: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:47.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:49.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:49.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:49 compute-2 ceph-mon[77142]: pgmap v641: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:51 compute-2 ceph-mon[77142]: pgmap v642: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:51.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:51 compute-2 kernel: SELinux:  Converting 2771 SID table entries...
Nov 29 06:33:51 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:33:51 compute-2 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:33:51 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:33:51 compute-2 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:33:51 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:33:51 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:33:51 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:33:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:51.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:53.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:53 compute-2 ceph-mon[77142]: pgmap v643: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:53.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:55.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:55.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:55 compute-2 ceph-mon[77142]: pgmap v644: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:56 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:57.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:57.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:58 compute-2 ceph-mon[77142]: pgmap v645: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:59 compute-2 ceph-mon[77142]: pgmap v646: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:59.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:33:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:59.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:01 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:34:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:01.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:34:01 compute-2 ceph-mon[77142]: pgmap v647: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:01.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:34:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:03.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:34:03 compute-2 ceph-mon[77142]: pgmap v648: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:03.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:05.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:05 compute-2 ceph-mon[77142]: pgmap v649: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:05.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:06 compute-2 sudo[152577]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:06 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 29 06:34:06 compute-2 sudo[152577]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:06 compute-2 sudo[152577]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:06 compute-2 sudo[152652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:06 compute-2 sudo[152652]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:06 compute-2 sudo[152652]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:07.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:07.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:09.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:09 compute-2 ceph-mon[77142]: pgmap v650: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:09.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:11 compute-2 ceph-mon[77142]: pgmap v651: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:11 compute-2 sudo[155714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:11 compute-2 sudo[155714]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:11 compute-2 sudo[155714]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:11 compute-2 sudo[155781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:34:11 compute-2 sudo[155781]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:11 compute-2 sudo[155781]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:11.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:11 compute-2 sudo[155843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:11 compute-2 sudo[155843]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:11 compute-2 sudo[155843]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:11 compute-2 sudo[155910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:34:11 compute-2 sudo[155910]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:11.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:11 compute-2 sudo[155910]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:12 compute-2 ceph-mon[77142]: pgmap v652: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:12 compute-2 podman[156984]: 2025-11-29 06:34:12.982834668 +0000 UTC m=+0.111412824 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 06:34:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:34:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:34:13 compute-2 ceph-mon[77142]: pgmap v653: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:34:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:34:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:34:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:34:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:13.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:13.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:34:15.124 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:34:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:34:15.124 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:34:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:34:15.124 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:34:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:15.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:15.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:16 compute-2 ceph-mon[77142]: pgmap v654: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:16 compute-2 podman[159745]: 2025-11-29 06:34:16.895845641 +0000 UTC m=+0.054923026 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 29 06:34:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:17.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:17.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:18 compute-2 ceph-mon[77142]: pgmap v655: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:19.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:19.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:20 compute-2 ceph-mon[77142]: pgmap v656: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:21.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:21 compute-2 ceph-mon[77142]: pgmap v657: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:21.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:23.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:23.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:25.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:34:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:25.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:34:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:27 compute-2 sudo[166389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:27 compute-2 sudo[166389]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:27 compute-2 sudo[166389]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:27 compute-2 sudo[166459]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:27 compute-2 sudo[166459]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:27 compute-2 sudo[166459]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:27.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:27.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:28 compute-2 ceph-mon[77142]: pgmap v658: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:29.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:34:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:29.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:34:31 compute-2 ceph-mon[77142]: pgmap v659: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:31 compute-2 ceph-mon[77142]: pgmap v660: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:31 compute-2 ceph-mon[77142]: pgmap v661: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:34:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:31.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:34:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:34:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:31.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:34:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:34:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:33.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:34:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:33.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:34 compute-2 ceph-mon[77142]: pgmap v662: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:35.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:35.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:37.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:37 compute-2 ceph-mon[77142]: pgmap v663: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:37 compute-2 ceph-mon[77142]: pgmap v664: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:34:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:37.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:34:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:39.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:39.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:40 compute-2 ceph-mon[77142]: pgmap v665: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:41.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:41.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:42 compute-2 ceph-mon[77142]: pgmap v666: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:43 compute-2 sudo[167712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:43 compute-2 sudo[167712]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:43 compute-2 sudo[167712]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:43 compute-2 sudo[167743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:34:43 compute-2 sudo[167743]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:43 compute-2 sudo[167743]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:43 compute-2 podman[167736]: 2025-11-29 06:34:43.153683929 +0000 UTC m=+0.095284719 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 06:34:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:43.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:43.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:44 compute-2 ceph-mon[77142]: pgmap v667: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:44 compute-2 ceph-mon[77142]: pgmap v668: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:44 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:34:44 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:34:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:45.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:45.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:47 compute-2 sudo[167794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:47 compute-2 sudo[167794]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:47 compute-2 sudo[167794]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:47 compute-2 podman[167818]: 2025-11-29 06:34:47.27986255 +0000 UTC m=+0.047027803 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:34:47 compute-2 sudo[167827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:47 compute-2 sudo[167827]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:47 compute-2 sudo[167827]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:47 compute-2 kernel: SELinux:  Converting 2772 SID table entries...
Nov 29 06:34:47 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:34:47 compute-2 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:34:47 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:34:47 compute-2 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:34:47 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:34:47 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:34:47 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:34:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:47.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:47.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:48 compute-2 ceph-mon[77142]: pgmap v669: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:48 compute-2 groupadd[167871]: group added to /etc/group: name=dnsmasq, GID=992
Nov 29 06:34:48 compute-2 groupadd[167871]: group added to /etc/gshadow: name=dnsmasq
Nov 29 06:34:48 compute-2 groupadd[167871]: new group: name=dnsmasq, GID=992
Nov 29 06:34:48 compute-2 useradd[167878]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 29 06:34:48 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 06:34:48 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 29 06:34:48 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 06:34:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:49.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:49 compute-2 ceph-mon[77142]: pgmap v670: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:49 compute-2 ceph-mon[77142]: pgmap v671: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:49 compute-2 groupadd[167891]: group added to /etc/group: name=clevis, GID=991
Nov 29 06:34:49 compute-2 groupadd[167891]: group added to /etc/gshadow: name=clevis
Nov 29 06:34:49 compute-2 groupadd[167891]: new group: name=clevis, GID=991
Nov 29 06:34:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:49.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:49 compute-2 useradd[167898]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 29 06:34:49 compute-2 usermod[167908]: add 'clevis' to group 'tss'
Nov 29 06:34:49 compute-2 usermod[167908]: add 'clevis' to shadow group 'tss'
Nov 29 06:34:51 compute-2 ceph-mon[77142]: pgmap v672: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:51.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:51.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:52 compute-2 ceph-mgr[77504]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 06:34:52 compute-2 polkitd[43476]: Reloading rules
Nov 29 06:34:52 compute-2 polkitd[43476]: Collecting garbage unconditionally...
Nov 29 06:34:52 compute-2 polkitd[43476]: Loading rules from directory /etc/polkit-1/rules.d
Nov 29 06:34:52 compute-2 polkitd[43476]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 29 06:34:52 compute-2 polkitd[43476]: Finished loading, compiling and executing 3 rules
Nov 29 06:34:52 compute-2 polkitd[43476]: Reloading rules
Nov 29 06:34:52 compute-2 polkitd[43476]: Collecting garbage unconditionally...
Nov 29 06:34:52 compute-2 polkitd[43476]: Loading rules from directory /etc/polkit-1/rules.d
Nov 29 06:34:52 compute-2 polkitd[43476]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 29 06:34:52 compute-2 polkitd[43476]: Finished loading, compiling and executing 3 rules
Nov 29 06:34:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:53.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:53 compute-2 groupadd[168097]: group added to /etc/group: name=ceph, GID=167
Nov 29 06:34:53 compute-2 groupadd[168097]: group added to /etc/gshadow: name=ceph
Nov 29 06:34:53 compute-2 groupadd[168097]: new group: name=ceph, GID=167
Nov 29 06:34:53 compute-2 useradd[168103]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 29 06:34:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:53.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:53 compute-2 ceph-mon[77142]: pgmap v673: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:55 compute-2 ceph-mon[77142]: pgmap v674: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:55.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:55.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:56 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:56 compute-2 systemd[1]: Stopping OpenSSH server daemon...
Nov 29 06:34:56 compute-2 sshd[1004]: Received signal 15; terminating.
Nov 29 06:34:56 compute-2 systemd[1]: sshd.service: Deactivated successfully.
Nov 29 06:34:56 compute-2 systemd[1]: Stopped OpenSSH server daemon.
Nov 29 06:34:56 compute-2 systemd[1]: sshd.service: Consumed 4.156s CPU time, read 32.0K from disk, written 56.0K to disk.
Nov 29 06:34:56 compute-2 systemd[1]: Stopped target sshd-keygen.target.
Nov 29 06:34:56 compute-2 systemd[1]: Stopping sshd-keygen.target...
Nov 29 06:34:56 compute-2 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 06:34:56 compute-2 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 06:34:56 compute-2 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 06:34:56 compute-2 systemd[1]: Reached target sshd-keygen.target.
Nov 29 06:34:56 compute-2 systemd[1]: Starting OpenSSH server daemon...
Nov 29 06:34:56 compute-2 sshd[168730]: Server listening on 0.0.0.0 port 22.
Nov 29 06:34:56 compute-2 sshd[168730]: Server listening on :: port 22.
Nov 29 06:34:56 compute-2 systemd[1]: Started OpenSSH server daemon.
Nov 29 06:34:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:57.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:57.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:58 compute-2 ceph-mon[77142]: pgmap v675: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:58 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:34:59 compute-2 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:34:59 compute-2 systemd[1]: Reloading.
Nov 29 06:34:59 compute-2 systemd-rc-local-generator[168984]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:34:59 compute-2 systemd-sysv-generator[168989]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:34:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:59.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:59 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:34:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:34:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:59.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:01.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:01.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:01 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:02 compute-2 ceph-mon[77142]: pgmap v676: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:03 compute-2 sudo[150234]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:03.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:03 compute-2 ceph-mon[77142]: pgmap v677: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:03 compute-2 ceph-mon[77142]: pgmap v678: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:03.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:05.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:05.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:06 compute-2 ceph-mon[77142]: pgmap v679: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:07 compute-2 sudo[175160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:35:07 compute-2 sudo[175160]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:07 compute-2 sudo[175160]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:07.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:07 compute-2 sudo[175261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:35:07 compute-2 sudo[175261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:07 compute-2 sudo[175261]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:07.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:09 compute-2 ceph-mon[77142]: pgmap v680: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:09.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:09.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:09 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:35:09 compute-2 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:35:09 compute-2 systemd[1]: man-db-cache-update.service: Consumed 10.421s CPU time.
Nov 29 06:35:09 compute-2 systemd[1]: run-rb3ab222678514015b742f487412dcd14.service: Deactivated successfully.
Nov 29 06:35:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:11.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:11.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:11 compute-2 ceph-mon[77142]: pgmap v681: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:12 compute-2 sshd-session[177458]: Invalid user support from 92.118.39.92 port 53810
Nov 29 06:35:12 compute-2 sshd-session[177458]: Connection closed by invalid user support 92.118.39.92 port 53810 [preauth]
Nov 29 06:35:13 compute-2 ceph-mon[77142]: pgmap v682: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:13.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:13.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:13 compute-2 podman[177461]: 2025-11-29 06:35:13.948211909 +0000 UTC m=+0.110485387 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 29 06:35:14 compute-2 ceph-mon[77142]: pgmap v683: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:35:15.125 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:35:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:35:15.126 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:35:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:35:15.126 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:35:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:15.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:15 compute-2 ceph-mon[77142]: pgmap v684: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:15.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:17.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:17.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:17 compute-2 podman[177490]: 2025-11-29 06:35:17.875551673 +0000 UTC m=+0.042081150 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 29 06:35:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:19.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:19.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:21.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:21.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:22 compute-2 ceph-mon[77142]: pgmap v685: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:23.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:23 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:35:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:23.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:25 compute-2 ceph-mon[77142]: pgmap v686: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:25 compute-2 ceph-mon[77142]: pgmap v687: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:25 compute-2 ceph-mon[77142]: pgmap v688: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:25.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:25.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:26 compute-2 ceph-mon[77142]: pgmap v689: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:27.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:27 compute-2 sudo[177514]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:35:27 compute-2 sudo[177514]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:27 compute-2 sudo[177514]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:27 compute-2 sudo[177539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:35:27 compute-2 sudo[177539]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:27 compute-2 sudo[177539]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:27.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:27 compute-2 ceph-mon[77142]: pgmap v690: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:29.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:29.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:30 compute-2 ceph-mon[77142]: pgmap v691: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:31.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:31 compute-2 ceph-mon[77142]: pgmap v692: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:31.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:33.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:33.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:34 compute-2 ceph-mon[77142]: pgmap v693: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:35.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:35.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:36 compute-2 ceph-mon[77142]: pgmap v694: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:37.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:37.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:39 compute-2 ceph-mon[77142]: pgmap v695: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:39.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:35:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:39.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:35:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:41.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:41.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:43 compute-2 sudo[177573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:35:43 compute-2 sudo[177573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:43 compute-2 sudo[177573]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:43 compute-2 sudo[177598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:35:43 compute-2 sudo[177598]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:43 compute-2 sudo[177598]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:43 compute-2 sudo[177623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:35:43 compute-2 sudo[177623]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:43 compute-2 sudo[177623]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:43 compute-2 sudo[177648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:35:43 compute-2 sudo[177648]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:43.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:43.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:43 compute-2 sudo[177648]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:44 compute-2 ceph-mon[77142]: pgmap v696: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:44 compute-2 podman[177704]: 2025-11-29 06:35:44.968534847 +0000 UTC m=+0.122484498 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:35:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:45.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:45.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:47.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:47 compute-2 sudo[177732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:35:47 compute-2 sudo[177732]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:47 compute-2 sudo[177732]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:47.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:47 compute-2 sudo[177757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:35:47 compute-2 sudo[177757]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:47 compute-2 sudo[177757]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:48 compute-2 ceph-mon[77142]: pgmap v697: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:48 compute-2 ceph-mon[77142]: pgmap v698: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:35:48 compute-2 podman[177783]: 2025-11-29 06:35:48.879629578 +0000 UTC m=+0.047428443 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 06:35:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:49.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:35:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:49.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:35:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:35:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:51.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:35:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:35:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:51.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:35:51 compute-2 ceph-mon[77142]: pgmap v699: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:35:51 compute-2 ceph-mon[77142]: pgmap v700: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:51 compute-2 ceph-mon[77142]: pgmap v701: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:51 compute-2 ceph-mon[77142]: pgmap v702: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:35:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:35:52 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:53 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:35:53 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:35:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:35:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:53.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:35:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:53.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:55.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:35:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:55.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:35:56 compute-2 ceph-mon[77142]: pgmap v703: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:57 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:35:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:57.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:35:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:35:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:57.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:35:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:59.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:59 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:35:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:35:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:35:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:59.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:36:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:01.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:36:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:01.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:36:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:36:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:03.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:36:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:36:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:03.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:36:04 compute-2 ceph-mon[77142]: pgmap v704: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:04 compute-2 ceph-mon[77142]: pgmap v705: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:04 compute-2 ceph-mon[77142]: pgmap v706: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:04 compute-2 ceph-mon[77142]: pgmap v707: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:05.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:05.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:07 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:36:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:07.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:36:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:07.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:07 compute-2 sudo[177812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:36:07 compute-2 sudo[177812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:36:07 compute-2 sudo[177812]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:07 compute-2 sudo[177837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:36:07 compute-2 sudo[177837]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:36:08 compute-2 sudo[177837]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:09.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:09.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:11 compute-2 ceph-mon[77142]: pgmap v708: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:11 compute-2 ceph-mon[77142]: pgmap v709: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:11.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:36:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:11.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:36:12 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:36:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:13.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:36:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:36:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:13.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:36:13 compute-2 ceph-mon[77142]: pgmap v710: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:13 compute-2 ceph-mon[77142]: pgmap v711: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:13 compute-2 ceph-mon[77142]: pgmap v712: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:36:15.126 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:36:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:36:15.127 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:36:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:36:15.127 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:36:15 compute-2 ceph-mon[77142]: pgmap v713: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:15.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:36:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:15.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:36:15 compute-2 podman[177866]: 2025-11-29 06:36:15.965084758 +0000 UTC m=+0.113858617 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 06:36:17 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:17.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:17.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:18 compute-2 ceph-mon[77142]: pgmap v714: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:36:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:19.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:36:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:19.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:19 compute-2 podman[177892]: 2025-11-29 06:36:19.922264559 +0000 UTC m=+0.079934259 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 29 06:36:20 compute-2 ceph-mon[77142]: pgmap v715: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:20 compute-2 ceph-mon[77142]: pgmap v716: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:36:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:21.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:36:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:36:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:21.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:36:22 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:23 compute-2 sshd-session[143530]: Received disconnect from 192.168.122.30 port 50168:11: disconnected by user
Nov 29 06:36:23 compute-2 sshd-session[143530]: Disconnected from user zuul 192.168.122.30 port 50168
Nov 29 06:36:23 compute-2 sshd-session[143527]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:36:23 compute-2 systemd[1]: session-48.scope: Deactivated successfully.
Nov 29 06:36:23 compute-2 systemd[1]: session-48.scope: Consumed 1min 54.847s CPU time.
Nov 29 06:36:23 compute-2 systemd-logind[784]: Session 48 logged out. Waiting for processes to exit.
Nov 29 06:36:23 compute-2 systemd-logind[784]: Removed session 48.
Nov 29 06:36:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:23.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:23.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:36:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:25.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:36:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 06:36:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:25.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 06:36:27 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.002000046s ======
Nov 29 06:36:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:27.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000046s
Nov 29 06:36:27 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:36:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:27.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:28 compute-2 sudo[177917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:36:28 compute-2 sudo[177917]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:36:28 compute-2 sudo[177917]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:28 compute-2 sudo[177942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:36:28 compute-2 sudo[177942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:36:28 compute-2 sudo[177942]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:29.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:36:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:29.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:36:29 compute-2 ceph-mon[77142]: pgmap v717: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:31.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:31.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:32 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:33 compute-2 ceph-mon[77142]: pgmap v718: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:33 compute-2 ceph-mon[77142]: pgmap v719: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:33 compute-2 ceph-mon[77142]: pgmap v720: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:33 compute-2 ceph-mon[77142]: pgmap v721: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:33 compute-2 ceph-mon[77142]: pgmap v722: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:33.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:36:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:33.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:36:34 compute-2 ceph-mon[77142]: pgmap v723: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:35.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:35.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:37 compute-2 ceph-mon[77142]: pgmap v724: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:37.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:37.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:37 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:39.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:39.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:40 compute-2 sshd-session[177973]: Accepted publickey for zuul from 192.168.122.30 port 59886 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:36:40 compute-2 systemd-logind[784]: New session 49 of user zuul.
Nov 29 06:36:40 compute-2 systemd[1]: Started Session 49 of User zuul.
Nov 29 06:36:40 compute-2 sshd-session[177973]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:36:40 compute-2 ceph-mon[77142]: pgmap v725: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:40 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:36:40 compute-2 sudo[178103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzvxeancvmxcdeaqpyzxjeuhndkotwvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398200.4601157-976-202787189435614/AnsiballZ_systemd.py'
Nov 29 06:36:40 compute-2 sudo[178103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:41 compute-2 sudo[178106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:36:41 compute-2 sudo[178106]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:36:41 compute-2 sudo[178106]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:41 compute-2 python3.9[178105]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:41 compute-2 sudo[178131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:36:41 compute-2 sudo[178131]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:36:41 compute-2 sudo[178131]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:41 compute-2 systemd[1]: Reloading.
Nov 29 06:36:41 compute-2 systemd-rc-local-generator[178184]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:41 compute-2 systemd-sysv-generator[178188]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:41 compute-2 sudo[178103]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:36:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:41.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:36:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:41.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:41 compute-2 ceph-mon[77142]: pgmap v726: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:41 compute-2 ceph-mon[77142]: pgmap v727: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:41 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:36:41 compute-2 sudo[178343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnwnftygzxuupfvhaioniclpvpcoondw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398201.597581-976-13886394811035/AnsiballZ_systemd.py'
Nov 29 06:36:41 compute-2 sudo[178343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:42 compute-2 python3.9[178345]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:42 compute-2 systemd[1]: Reloading.
Nov 29 06:36:42 compute-2 systemd-sysv-generator[178381]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:42 compute-2 systemd-rc-local-generator[178377]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:42 compute-2 sudo[178343]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:42 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:43 compute-2 sudo[178535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpkhuauyqtmplzroadrcovbpzfenftlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398202.8214152-976-774896630282/AnsiballZ_systemd.py'
Nov 29 06:36:43 compute-2 sudo[178535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:43 compute-2 python3.9[178537]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:43.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:43.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:43 compute-2 ceph-mon[77142]: pgmap v728: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:44 compute-2 systemd[1]: Reloading.
Nov 29 06:36:44 compute-2 systemd-rc-local-generator[178567]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:44 compute-2 systemd-sysv-generator[178571]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:44 compute-2 sudo[178535]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:45 compute-2 sudo[178726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkkcohuvcmttlltnsavimqhafqginvyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398204.972173-976-73985087597093/AnsiballZ_systemd.py'
Nov 29 06:36:45 compute-2 sudo[178726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:45.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:45 compute-2 python3.9[178728]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:45 compute-2 systemd[1]: Reloading.
Nov 29 06:36:45 compute-2 systemd-rc-local-generator[178760]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:45 compute-2 systemd-sysv-generator[178763]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:45.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:45 compute-2 ceph-mon[77142]: pgmap v729: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:45 compute-2 sudo[178726]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:46 compute-2 podman[178793]: 2025-11-29 06:36:46.942973959 +0000 UTC m=+0.094992141 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:36:47 compute-2 ceph-mon[77142]: pgmap v730: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:47 compute-2 sudo[178945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqzfhrdkqdkcxqbuammwuyqrulzuavzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398206.8966022-1065-153702858329029/AnsiballZ_systemd.py'
Nov 29 06:36:47 compute-2 sudo[178945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:47 compute-2 python3.9[178947]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:47.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:47 compute-2 systemd[1]: Reloading.
Nov 29 06:36:47 compute-2 systemd-sysv-generator[178980]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:47 compute-2 systemd-rc-local-generator[178973]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:47.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:47 compute-2 sudo[178945]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:47 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:48 compute-2 sudo[179084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:36:48 compute-2 sudo[179084]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:36:48 compute-2 sudo[179084]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:48 compute-2 sudo[179133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:36:48 compute-2 sudo[179133]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:36:48 compute-2 sudo[179133]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:48 compute-2 sudo[179184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqswnxyrlncoyrxmmtzazknxszgxdeeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398208.0780382-1065-173332964009992/AnsiballZ_systemd.py'
Nov 29 06:36:48 compute-2 sudo[179184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:48 compute-2 python3.9[179186]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:48 compute-2 systemd[1]: Reloading.
Nov 29 06:36:48 compute-2 systemd-sysv-generator[179220]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:48 compute-2 systemd-rc-local-generator[179214]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:49 compute-2 sudo[179184]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:49 compute-2 sudo[179375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-benjxmgnxdaujhmqusiyuaqfgmffrfqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398209.1733367-1065-65232790536395/AnsiballZ_systemd.py'
Nov 29 06:36:49 compute-2 sudo[179375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:49.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:49 compute-2 python3.9[179377]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:49 compute-2 systemd[1]: Reloading.
Nov 29 06:36:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:49.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:49 compute-2 systemd-rc-local-generator[179407]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:49 compute-2 systemd-sysv-generator[179411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:50 compute-2 ceph-mon[77142]: pgmap v731: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:50 compute-2 sudo[179375]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:50 compute-2 podman[179416]: 2025-11-29 06:36:50.163892602 +0000 UTC m=+0.053973033 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 06:36:50 compute-2 sudo[179585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pohdwfltyvcqkjmwtoulfdtcqjfbgwoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398210.2358415-1065-228707438904370/AnsiballZ_systemd.py'
Nov 29 06:36:50 compute-2 sudo[179585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:50 compute-2 python3.9[179587]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:50 compute-2 sudo[179585]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:51 compute-2 ceph-mon[77142]: pgmap v732: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:51 compute-2 sudo[179740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbxdtnwjiglgkzneccvazoxlprzcpowb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398211.0142412-1065-66454211819855/AnsiballZ_systemd.py'
Nov 29 06:36:51 compute-2 sudo[179740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:51.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:51 compute-2 python3.9[179742]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:51 compute-2 systemd[1]: Reloading.
Nov 29 06:36:51 compute-2 systemd-rc-local-generator[179767]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:51 compute-2 systemd-sysv-generator[179771]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:51.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:51 compute-2 sudo[179740]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:52 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:53 compute-2 sudo[179931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osbagvzvzhzqrroulxdqbzltgehfprzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398213.258094-1173-9301828339342/AnsiballZ_systemd.py'
Nov 29 06:36:53 compute-2 sudo[179931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:36:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:53.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:36:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:53.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:53 compute-2 python3.9[179933]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:53 compute-2 systemd[1]: Reloading.
Nov 29 06:36:53 compute-2 systemd-rc-local-generator[179960]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:53 compute-2 systemd-sysv-generator[179964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:55.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:55.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:56 compute-2 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 29 06:36:56 compute-2 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 29 06:36:56 compute-2 sudo[179931]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:56 compute-2 ceph-mon[77142]: pgmap v733: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:56 compute-2 sudo[180127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxdqgirlwgdbqzeurtlzebgjevicrwkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398216.3672338-1197-104270335306058/AnsiballZ_systemd.py'
Nov 29 06:36:56 compute-2 sudo[180127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:56 compute-2 python3.9[180129]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:57 compute-2 sudo[180127]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:57 compute-2 ceph-mon[77142]: pgmap v734: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:57 compute-2 ceph-mon[77142]: pgmap v735: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:57 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 29 06:36:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:57.476897) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:36:57 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 29 06:36:57 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217476961, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2443, "num_deletes": 251, "total_data_size": 6349708, "memory_usage": 6431728, "flush_reason": "Manual Compaction"}
Nov 29 06:36:57 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 29 06:36:57 compute-2 sudo[180282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmcsixevpupckccpxnqotitrpasemthh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398217.2181664-1197-265366826739492/AnsiballZ_systemd.py'
Nov 29 06:36:57 compute-2 sudo[180282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:57.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:57 compute-2 python3.9[180284]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:57.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:57 compute-2 sudo[180282]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398218109138, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 4155680, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10636, "largest_seqno": 13074, "table_properties": {"data_size": 4145762, "index_size": 6348, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19650, "raw_average_key_size": 20, "raw_value_size": 4125989, "raw_average_value_size": 4205, "num_data_blocks": 284, "num_entries": 981, "num_filter_entries": 981, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397891, "oldest_key_time": 1764397891, "file_creation_time": 1764398217, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 632333 microseconds, and 9962 cpu microseconds.
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.109223) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 4155680 bytes OK
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.109258) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.113621) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.113673) EVENT_LOG_v1 {"time_micros": 1764398218113660, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.113705) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 6339244, prev total WAL file size 6339244, number of live WAL files 2.
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:36:58 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.115852) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(4058KB)], [21(9323KB)]
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398218115921, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 13702897, "oldest_snapshot_seqno": -1}
Nov 29 06:36:58 compute-2 sudo[180437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eabkrrndihevpuhpoqwsjmasyakgteyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398218.045936-1197-148465338922003/AnsiballZ_systemd.py'
Nov 29 06:36:58 compute-2 sudo[180437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4464 keys, 10609735 bytes, temperature: kUnknown
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398218425311, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 10609735, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10574934, "index_size": 22531, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11205, "raw_key_size": 109201, "raw_average_key_size": 24, "raw_value_size": 10489328, "raw_average_value_size": 2349, "num_data_blocks": 972, "num_entries": 4464, "num_filter_entries": 4464, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398218, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.425516) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 10609735 bytes
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.426941) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 44.3 rd, 34.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 9.1 +0.0 blob) out(10.1 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 4982, records dropped: 518 output_compression: NoCompression
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.426988) EVENT_LOG_v1 {"time_micros": 1764398218426968, "job": 10, "event": "compaction_finished", "compaction_time_micros": 309449, "compaction_time_cpu_micros": 25577, "output_level": 6, "num_output_files": 1, "total_output_size": 10609735, "num_input_records": 4982, "num_output_records": 4464, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398218428125, "job": 10, "event": "table_file_deletion", "file_number": 23}
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398218430488, "job": 10, "event": "table_file_deletion", "file_number": 21}
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.115702) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.430570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.430575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.430577) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.430579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:58 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.430581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:58 compute-2 python3.9[180440]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:58 compute-2 sudo[180437]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:59 compute-2 sudo[180593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzirarwssduxofbznzgptzlweadlzfou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398218.8804572-1197-67704274232842/AnsiballZ_systemd.py'
Nov 29 06:36:59 compute-2 sudo[180593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:59 compute-2 python3.9[180595]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:59 compute-2 sudo[180593]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:36:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:59.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:36:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:36:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:59.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:59 compute-2 sudo[180748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwlfhhqslbsauozskpvnvckzggegfxiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398219.6722322-1197-239423900685777/AnsiballZ_systemd.py'
Nov 29 06:36:59 compute-2 sudo[180748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:00 compute-2 python3.9[180750]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:00 compute-2 sudo[180748]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:00 compute-2 sudo[180904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoesrnrcyoxzsacpqxakrpfbdimkveii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398220.449779-1197-228901423113985/AnsiballZ_systemd.py'
Nov 29 06:37:00 compute-2 sudo[180904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:01 compute-2 python3.9[180906]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:01 compute-2 sudo[180904]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:01.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:01 compute-2 sudo[181059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewqbdlufgtosxzeefjkqhuiubyxxkzsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398221.3332295-1197-110234638401659/AnsiballZ_systemd.py'
Nov 29 06:37:01 compute-2 sudo[181059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:01.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:01 compute-2 ceph-mon[77142]: pgmap v736: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:01 compute-2 python3.9[181061]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:02 compute-2 sudo[181059]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:02 compute-2 sudo[181215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unbgfkrjurlntdybqcqeujqgazfblnza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398222.231728-1197-34908274637550/AnsiballZ_systemd.py'
Nov 29 06:37:02 compute-2 sudo[181215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:02 compute-2 python3.9[181217]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:02 compute-2 sudo[181215]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:03 compute-2 sudo[181370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddwifofgvkwctoejyhshypsxwjhualmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398222.9786408-1197-173964384978732/AnsiballZ_systemd.py'
Nov 29 06:37:03 compute-2 sudo[181370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:03 compute-2 ceph-mon[77142]: pgmap v737: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:03 compute-2 python3.9[181372]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:03.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:03 compute-2 sudo[181370]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:03.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:04 compute-2 sudo[181525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjyzjlhmwlzkgszqanidepijcncmrncg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398223.7689831-1197-68781015287094/AnsiballZ_systemd.py'
Nov 29 06:37:04 compute-2 sudo[181525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:04 compute-2 python3.9[181527]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:04 compute-2 sudo[181525]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:04 compute-2 ceph-mon[77142]: pgmap v738: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:04 compute-2 sudo[181681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfjneurgkqqxiqvpudeulfyxttbfnfki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398224.5623224-1197-86840846197964/AnsiballZ_systemd.py'
Nov 29 06:37:04 compute-2 sudo[181681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:05 compute-2 python3.9[181683]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:05 compute-2 sudo[181681]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:05.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:05 compute-2 sudo[181836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oedbbzkzssoorfrnvfqorzmefasowxpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398225.3582344-1197-64168047971550/AnsiballZ_systemd.py'
Nov 29 06:37:05 compute-2 sudo[181836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:37:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:05.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:37:06 compute-2 python3.9[181838]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:06 compute-2 sudo[181836]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:06 compute-2 ceph-mon[77142]: pgmap v739: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:06 compute-2 sudo[181992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbhdhdzzpqbonhojizorqdklmoqblqkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398226.2924273-1197-214315750315747/AnsiballZ_systemd.py'
Nov 29 06:37:06 compute-2 sudo[181992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:06 compute-2 python3.9[181994]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:06 compute-2 sudo[181992]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:07 compute-2 ceph-mon[77142]: pgmap v740: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:07 compute-2 sudo[182147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgylsuhqgqchkdsprnbynqkghrinflop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398227.0660698-1197-260363378965873/AnsiballZ_systemd.py'
Nov 29 06:37:07 compute-2 sudo[182147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:07.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:07 compute-2 python3.9[182149]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:07 compute-2 sudo[182147]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:07.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:08 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:08 compute-2 sudo[182178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:08 compute-2 sudo[182178]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:08 compute-2 sudo[182178]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:08 compute-2 sudo[182203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:08 compute-2 sudo[182203]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:08 compute-2 sudo[182203]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:09.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:09.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:10 compute-2 ceph-mon[77142]: pgmap v741: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:10 compute-2 sudo[182354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwlpjdqaervmxiduyvohougvdbfhuvno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398230.665386-1503-65977389384625/AnsiballZ_file.py'
Nov 29 06:37:10 compute-2 sudo[182354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:11 compute-2 python3.9[182356]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:11 compute-2 sudo[182354]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:11 compute-2 sudo[182506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kezgaivyqdkxkrcqjyqzgmygxnxfqpas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398231.2965603-1503-190701255089115/AnsiballZ_file.py'
Nov 29 06:37:11 compute-2 sudo[182506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:11.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:11 compute-2 python3.9[182508]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:11 compute-2 sudo[182506]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:11.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:12 compute-2 sudo[182658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoageafmqxmzffetoghdfnqgghkityfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398231.8585021-1503-186902394194088/AnsiballZ_file.py'
Nov 29 06:37:12 compute-2 sudo[182658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:12 compute-2 python3.9[182660]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:12 compute-2 sudo[182658]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:12 compute-2 sudo[182811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsayzdndnpvfsorpwpxokqydscpagcov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398232.4744227-1503-71783334098750/AnsiballZ_file.py'
Nov 29 06:37:12 compute-2 sudo[182811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:12 compute-2 python3.9[182813]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:12 compute-2 sudo[182811]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:13 compute-2 sudo[182963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctvkwstajbpwvsaomzfvhlorlsbwwthm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398233.02727-1503-209052742033117/AnsiballZ_file.py'
Nov 29 06:37:13 compute-2 sudo[182963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:13 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:13 compute-2 ceph-mon[77142]: pgmap v742: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:13 compute-2 python3.9[182965]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:13 compute-2 sudo[182963]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:13.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:13.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:13 compute-2 sudo[183115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxmguugsirxtlrtlxkjgftipymymjnjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398233.6405804-1503-84264015513125/AnsiballZ_file.py'
Nov 29 06:37:13 compute-2 sudo[183115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:14 compute-2 python3.9[183117]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:14 compute-2 sudo[183115]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:14 compute-2 ceph-mon[77142]: pgmap v743: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:37:15.127 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:37:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:37:15.128 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:37:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:37:15.128 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:37:15 compute-2 sudo[183268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjxbnbfwchtfrulewkbcooooajgxgogt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398234.9310288-1633-68474357176721/AnsiballZ_stat.py'
Nov 29 06:37:15 compute-2 sudo[183268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:15 compute-2 python3.9[183270]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:15 compute-2 sudo[183268]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:15 compute-2 ceph-mon[77142]: pgmap v744: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:15.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:15.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:16 compute-2 sudo[183393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luqfaqhwxshszbriiqxofktlfozozyav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398234.9310288-1633-68474357176721/AnsiballZ_copy.py'
Nov 29 06:37:16 compute-2 sudo[183393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:16 compute-2 python3.9[183395]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398234.9310288-1633-68474357176721/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:16 compute-2 sudo[183393]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:16 compute-2 sudo[183546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkchgqfbsonxubrblkmjdpcdhbzztgsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398236.3487127-1633-251623601574031/AnsiballZ_stat.py'
Nov 29 06:37:16 compute-2 sudo[183546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:16 compute-2 python3.9[183548]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:16 compute-2 sudo[183546]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:17 compute-2 sudo[183682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eckbmlgbqoayoaqpxllwxslzvsqwoyso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398236.3487127-1633-251623601574031/AnsiballZ_copy.py'
Nov 29 06:37:17 compute-2 sudo[183682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:17 compute-2 podman[183645]: 2025-11-29 06:37:17.321016546 +0000 UTC m=+0.103284012 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 06:37:17 compute-2 python3.9[183688]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398236.3487127-1633-251623601574031/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:17 compute-2 sudo[183682]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:17.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:17 compute-2 sudo[183848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otgmttgogjyeqvalvwrqhmataskwacdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398237.5805945-1633-112677451003039/AnsiballZ_stat.py'
Nov 29 06:37:17 compute-2 sudo[183848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:17.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:18 compute-2 python3.9[183850]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:18 compute-2 sudo[183848]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:18 compute-2 sudo[183973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wywjzvkxnwhmigwunvgymvxhanlefwvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398237.5805945-1633-112677451003039/AnsiballZ_copy.py'
Nov 29 06:37:18 compute-2 sudo[183973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:18 compute-2 ceph-mon[77142]: pgmap v745: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:18 compute-2 python3.9[183975]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398237.5805945-1633-112677451003039/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:18 compute-2 sudo[183973]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:18 compute-2 sudo[184126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cehwqgexijieslguhbprxksswsfcgqqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398238.671048-1633-267694321109828/AnsiballZ_stat.py'
Nov 29 06:37:18 compute-2 sudo[184126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:19 compute-2 python3.9[184128]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:19 compute-2 sudo[184126]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:19 compute-2 sudo[184251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flzzcfxctqdbnlirycdnntmibnkwofoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398238.671048-1633-267694321109828/AnsiballZ_copy.py'
Nov 29 06:37:19 compute-2 sudo[184251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:19.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:19 compute-2 python3.9[184253]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398238.671048-1633-267694321109828/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:19 compute-2 sudo[184251]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:19.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:19 compute-2 ceph-mon[77142]: pgmap v746: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:20 compute-2 sudo[184403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blpirwaiaklcyyvczkscyzsgimsuzelo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398239.8748834-1633-171931559811636/AnsiballZ_stat.py'
Nov 29 06:37:20 compute-2 sudo[184403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:20 compute-2 podman[184405]: 2025-11-29 06:37:20.243783308 +0000 UTC m=+0.049941286 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:37:20 compute-2 python3.9[184406]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:20 compute-2 sudo[184403]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:20 compute-2 sudo[184548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkjodsycbztsilyincptvqupsdupzxdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398239.8748834-1633-171931559811636/AnsiballZ_copy.py'
Nov 29 06:37:20 compute-2 sudo[184548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:21 compute-2 python3.9[184550]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398239.8748834-1633-171931559811636/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:21 compute-2 sudo[184548]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:21 compute-2 sudo[184700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thqsqicdgxndbtjfnwpsjapbvjlynvsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398241.2906885-1633-214051068524991/AnsiballZ_stat.py'
Nov 29 06:37:21 compute-2 sudo[184700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:21.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:21 compute-2 python3.9[184702]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:21 compute-2 sudo[184700]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:21.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:21 compute-2 ceph-mon[77142]: pgmap v747: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:22 compute-2 sudo[184827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxxxphupbpqftyjilvqtapotnuxpcjmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398241.2906885-1633-214051068524991/AnsiballZ_copy.py'
Nov 29 06:37:22 compute-2 sudo[184827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:22 compute-2 sshd-session[184703]: Invalid user www from 92.118.39.92 port 47236
Nov 29 06:37:22 compute-2 python3.9[184829]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398241.2906885-1633-214051068524991/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:22 compute-2 sudo[184827]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:22 compute-2 sshd-session[184703]: Connection closed by invalid user www 92.118.39.92 port 47236 [preauth]
Nov 29 06:37:22 compute-2 sudo[184980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybrcyibbosernmuiiqvexkjawvsvhsbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398242.4341588-1633-267192192703320/AnsiballZ_stat.py'
Nov 29 06:37:22 compute-2 sudo[184980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:22 compute-2 python3.9[184982]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:22 compute-2 sudo[184980]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:23 compute-2 sudo[185103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxdinzsptwceshtyaoeeamgxuwjtkrkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398242.4341588-1633-267192192703320/AnsiballZ_copy.py'
Nov 29 06:37:23 compute-2 sudo[185103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:23 compute-2 python3.9[185105]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398242.4341588-1633-267192192703320/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:23 compute-2 sudo[185103]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:23.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:23 compute-2 sudo[185255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-resmthjfblpsnkomeaxsklmlszyznlwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398243.5279498-1633-28039942833854/AnsiballZ_stat.py'
Nov 29 06:37:23 compute-2 sudo[185255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:37:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:23.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:37:23 compute-2 python3.9[185257]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:23 compute-2 sudo[185255]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:24 compute-2 sudo[185380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvixujjuwovgppctbnkdcybjsxyktyfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398243.5279498-1633-28039942833854/AnsiballZ_copy.py'
Nov 29 06:37:24 compute-2 sudo[185380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:24 compute-2 python3.9[185382]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398243.5279498-1633-28039942833854/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:24 compute-2 sudo[185380]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:25.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:25 compute-2 ceph-mon[77142]: pgmap v748: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:25.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:27 compute-2 sudo[185534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vizesigvnbbgrlfpbxyvpmjzuvpfigvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398246.6770127-1972-123608564973146/AnsiballZ_command.py'
Nov 29 06:37:27 compute-2 sudo[185534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:27 compute-2 python3.9[185536]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 29 06:37:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:27.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:27.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:28 compute-2 sudo[185539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:28 compute-2 sudo[185539]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:28 compute-2 sudo[185539]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:28 compute-2 sudo[185564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:28 compute-2 sudo[185564]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:28 compute-2 sudo[185564]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:29.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:29.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:30 compute-2 sudo[185534]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:30 compute-2 sudo[185739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfywkxclasutlrsvldswqlqbjgywcxjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398250.522918-1999-170981938924435/AnsiballZ_file.py'
Nov 29 06:37:30 compute-2 sudo[185739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:31 compute-2 python3.9[185741]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:31 compute-2 sudo[185739]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:31 compute-2 ceph-mon[77142]: pgmap v749: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:31 compute-2 sudo[185891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlgwjpdmfqxnhtebecvinamfcphkmndy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398251.2240949-1999-52503931774576/AnsiballZ_file.py'
Nov 29 06:37:31 compute-2 sudo[185891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:31.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:31 compute-2 python3.9[185893]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:31 compute-2 sudo[185891]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:31.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:32 compute-2 sudo[186043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnntyfpuddwaiwosxscjsrzjexgyfoco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398251.8603256-1999-20771765449007/AnsiballZ_file.py'
Nov 29 06:37:32 compute-2 sudo[186043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:32 compute-2 python3.9[186045]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:32 compute-2 sudo[186043]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:32 compute-2 sudo[186196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prpooibuovnkqovgsufxxsihlvmekxns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398252.4662645-1999-159136229285359/AnsiballZ_file.py'
Nov 29 06:37:32 compute-2 sudo[186196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:32 compute-2 python3.9[186198]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:32 compute-2 sudo[186196]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:33 compute-2 sudo[186348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzjqutsbtyreqlsotfwleriqqslfuqfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398253.1224873-1999-219625784313844/AnsiballZ_file.py'
Nov 29 06:37:33 compute-2 sudo[186348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:33.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:33 compute-2 python3.9[186350]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:33 compute-2 sudo[186348]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:33.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:34 compute-2 sudo[186500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwdmqwuotinfqdmtpxeabpunfrytjvyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398253.8058174-1999-256101072443159/AnsiballZ_file.py'
Nov 29 06:37:34 compute-2 sudo[186500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:34 compute-2 python3.9[186502]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:34 compute-2 sudo[186500]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:34 compute-2 sudo[186653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyiqfrxglzaqcvivbjhelbpjwrcdsgsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398254.4144783-1999-182076006692797/AnsiballZ_file.py'
Nov 29 06:37:34 compute-2 sudo[186653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:34 compute-2 python3.9[186655]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:34 compute-2 sudo[186653]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:35 compute-2 sudo[186805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cllnlworbgatpfrpwbqlndhaueyzjkam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398255.1445851-1999-153981320198811/AnsiballZ_file.py'
Nov 29 06:37:35 compute-2 sudo[186805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:35.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:35 compute-2 python3.9[186807]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:35 compute-2 sudo[186805]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:35.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:36 compute-2 sudo[186957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qokpgttizynsadmmruxuikvsuuqznrrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398255.8730712-1999-88833946583147/AnsiballZ_file.py'
Nov 29 06:37:36 compute-2 sudo[186957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:36 compute-2 python3.9[186959]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:36 compute-2 sudo[186957]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:37 compute-2 sudo[187110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrrjfexkouidjnxhrretxqoktcykwsye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398256.7038403-1999-213836385050480/AnsiballZ_file.py'
Nov 29 06:37:37 compute-2 sudo[187110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:37 compute-2 python3.9[187112]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:37 compute-2 sudo[187110]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:37.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:37 compute-2 sudo[187262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvllutmxihfsynviqqxtrgpbjshlfeph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398257.4912703-1999-231942237478919/AnsiballZ_file.py'
Nov 29 06:37:37 compute-2 sudo[187262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:37.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:38 compute-2 ceph-mon[77142]: pgmap v750: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:38 compute-2 ceph-mon[77142]: pgmap v751: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:38 compute-2 ceph-mon[77142]: pgmap v752: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:38 compute-2 python3.9[187264]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:38 compute-2 sudo[187262]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:38 compute-2 sudo[187415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-galpdgbkdnuvuqikpkmwpyssycrxmfgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398258.1830215-1999-198918411498385/AnsiballZ_file.py'
Nov 29 06:37:38 compute-2 sudo[187415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:38 compute-2 python3.9[187417]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:38 compute-2 sudo[187415]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:38 compute-2 ceph-mon[77142]: pgmap v753: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:38 compute-2 ceph-mon[77142]: pgmap v754: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:39 compute-2 sudo[187567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srsqcqbbszseqypezfmveuxxnuveivoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398258.7838893-1999-2449280087643/AnsiballZ_file.py'
Nov 29 06:37:39 compute-2 sudo[187567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:39 compute-2 python3.9[187569]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:39 compute-2 sudo[187567]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:39.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:39 compute-2 sudo[187719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfcitdmyhbcuhbbokcfiswruydxbvasc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398259.4318247-1999-242324761103991/AnsiballZ_file.py'
Nov 29 06:37:39 compute-2 sudo[187719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:39.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:39 compute-2 python3.9[187721]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:39 compute-2 sudo[187719]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:40 compute-2 ceph-mon[77142]: pgmap v755: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:40 compute-2 ceph-mon[77142]: pgmap v756: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:41 compute-2 sudo[187747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:41 compute-2 sudo[187747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:41 compute-2 sudo[187747]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:41 compute-2 sudo[187772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:37:41 compute-2 sudo[187772]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:41 compute-2 sudo[187772]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:41 compute-2 sudo[187797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:41 compute-2 sudo[187797]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:41 compute-2 sudo[187797]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:41 compute-2 sudo[187822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:37:41 compute-2 sudo[187822]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:41.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:41.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:41 compute-2 sudo[188002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnreuwfokagoowkadqfjhudhixnjlzsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398261.6236115-2296-210325227478095/AnsiballZ_stat.py'
Nov 29 06:37:41 compute-2 sudo[188002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:41 compute-2 sudo[187822]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:42 compute-2 python3.9[188004]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:42 compute-2 sudo[188002]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:42 compute-2 sudo[188126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-werhgniwkaobsqjclutlcbhckywvgdwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398261.6236115-2296-210325227478095/AnsiballZ_copy.py'
Nov 29 06:37:42 compute-2 sudo[188126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:42 compute-2 python3.9[188128]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398261.6236115-2296-210325227478095/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:42 compute-2 sudo[188126]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:42 compute-2 ceph-mon[77142]: pgmap v757: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:43 compute-2 sudo[188278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfxqcdyubvmnkerhnoviytwcwhwvvelq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398262.8625185-2296-229010145427433/AnsiballZ_stat.py'
Nov 29 06:37:43 compute-2 sudo[188278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:43 compute-2 python3.9[188280]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:43 compute-2 sudo[188278]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:43.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:43 compute-2 sudo[188401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkpoenlurquyrmuuipalpnrarjajuirf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398262.8625185-2296-229010145427433/AnsiballZ_copy.py'
Nov 29 06:37:43 compute-2 sudo[188401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:43 compute-2 python3.9[188403]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398262.8625185-2296-229010145427433/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:43 compute-2 sudo[188401]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:43.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:44 compute-2 sudo[188553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydrghdajyknpeiahxolztnblbouzrzmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398264.0138283-2296-222085802431806/AnsiballZ_stat.py'
Nov 29 06:37:44 compute-2 sudo[188553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:44 compute-2 python3.9[188555]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:44 compute-2 sudo[188553]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:44 compute-2 sudo[188677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nttaosycwvyifiavvqvqlkfnqyndcrxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398264.0138283-2296-222085802431806/AnsiballZ_copy.py'
Nov 29 06:37:44 compute-2 sudo[188677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:45 compute-2 python3.9[188679]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398264.0138283-2296-222085802431806/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:45 compute-2 sudo[188677]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:45 compute-2 sudo[188829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nojzftltarkomndjipcylwiitzamzawt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398265.2459807-2296-234555465417789/AnsiballZ_stat.py'
Nov 29 06:37:45 compute-2 sudo[188829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:45 compute-2 ceph-mon[77142]: pgmap v758: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:45 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:37:45 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:37:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:45.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:45 compute-2 python3.9[188831]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:45 compute-2 sudo[188829]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:45.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:46 compute-2 sudo[188952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdxfllfduupkkzqpqdowgcxjsvoszwnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398265.2459807-2296-234555465417789/AnsiballZ_copy.py'
Nov 29 06:37:46 compute-2 sudo[188952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:46 compute-2 python3.9[188954]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398265.2459807-2296-234555465417789/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:46 compute-2 sudo[188952]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:47 compute-2 sudo[189105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skaifnsigjlottqzmdnmqueuicgqfagc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398266.4155471-2296-61651620805441/AnsiballZ_stat.py'
Nov 29 06:37:47 compute-2 sudo[189105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:47 compute-2 python3.9[189107]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:47 compute-2 ceph-mon[77142]: pgmap v759: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:37:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:37:47 compute-2 sudo[189105]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000032s ======
Nov 29 06:37:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:47.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Nov 29 06:37:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:47.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:47 compute-2 sudo[189254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwgzvlvdwslrqahbkuusobibztvbfykl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398266.4155471-2296-61651620805441/AnsiballZ_copy.py'
Nov 29 06:37:47 compute-2 sudo[189254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:47 compute-2 podman[189178]: 2025-11-29 06:37:47.973114535 +0000 UTC m=+0.126441976 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:37:48 compute-2 python3.9[189256]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398266.4155471-2296-61651620805441/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:48 compute-2 sudo[189254]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:48 compute-2 sudo[189407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enscrgtpmltnvgnbpfybghwayeabgicm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398268.321921-2296-78086711225755/AnsiballZ_stat.py'
Nov 29 06:37:48 compute-2 sudo[189407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:48 compute-2 python3.9[189409]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:48 compute-2 sudo[189407]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:48 compute-2 sudo[189410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:48 compute-2 sudo[189410]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:48 compute-2 sudo[189410]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:48 compute-2 ceph-mon[77142]: pgmap v760: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:37:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:37:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:37:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:37:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:37:48 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:37:48 compute-2 sudo[189452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:48 compute-2 sudo[189452]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:48 compute-2 sudo[189452]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:49 compute-2 sudo[189580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcdjkfdikfwxmjipoxbqvorfcbizixrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398268.321921-2296-78086711225755/AnsiballZ_copy.py'
Nov 29 06:37:49 compute-2 sudo[189580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:49 compute-2 python3.9[189582]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398268.321921-2296-78086711225755/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:49 compute-2 sudo[189580]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:49.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:49.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:49 compute-2 sudo[189732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqirrmwfmxymsnlqcpvewriiggkfskka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398269.6306927-2296-66826701042962/AnsiballZ_stat.py'
Nov 29 06:37:49 compute-2 sudo[189732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:49 compute-2 ceph-mon[77142]: pgmap v761: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:50 compute-2 python3.9[189734]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:50 compute-2 sudo[189732]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:50 compute-2 sudo[189869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdjgradctvvrqoevqrcbvrznwkgsleez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398269.6306927-2296-66826701042962/AnsiballZ_copy.py'
Nov 29 06:37:50 compute-2 sudo[189869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:50 compute-2 podman[189829]: 2025-11-29 06:37:50.47070735 +0000 UTC m=+0.064079326 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:37:50 compute-2 python3.9[189875]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398269.6306927-2296-66826701042962/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:50 compute-2 sudo[189869]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:51 compute-2 sudo[190027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfyqcccnqocierbbysiqqfmzvhgulndl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398270.8204887-2296-211885001576689/AnsiballZ_stat.py'
Nov 29 06:37:51 compute-2 sudo[190027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:51 compute-2 python3.9[190029]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:51 compute-2 sudo[190027]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:51.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:51 compute-2 sudo[190150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhamgvemodhedoyzxmotbftedujgkxru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398270.8204887-2296-211885001576689/AnsiballZ_copy.py'
Nov 29 06:37:51 compute-2 sudo[190150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:51 compute-2 python3.9[190152]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398270.8204887-2296-211885001576689/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:51 compute-2 sudo[190150]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:51.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:52 compute-2 sudo[190302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-purovnxytaoguxatmfwngxpmsancoqcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398271.9884522-2296-154282123026614/AnsiballZ_stat.py'
Nov 29 06:37:52 compute-2 sudo[190302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:52 compute-2 ceph-mon[77142]: pgmap v762: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:52 compute-2 python3.9[190304]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:52 compute-2 sudo[190302]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:52 compute-2 sudo[190426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utcsupegjoaewjczsleugdxwhodlwqui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398271.9884522-2296-154282123026614/AnsiballZ_copy.py'
Nov 29 06:37:52 compute-2 sudo[190426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:52 compute-2 python3.9[190428]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398271.9884522-2296-154282123026614/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:52 compute-2 sudo[190426]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:53 compute-2 ceph-mon[77142]: pgmap v763: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 06:37:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:53.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 06:37:53 compute-2 sudo[190578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaqhthzsowcphctiuoleyifupzdalnza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398273.0850105-2296-78084349766886/AnsiballZ_stat.py'
Nov 29 06:37:53 compute-2 sudo[190578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000032s ======
Nov 29 06:37:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:53.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Nov 29 06:37:53 compute-2 python3.9[190580]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:53 compute-2 sudo[190578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:54 compute-2 sudo[190701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqwtumcbbxrpjxjniaumirogcailjesm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398273.0850105-2296-78084349766886/AnsiballZ_copy.py'
Nov 29 06:37:54 compute-2 sudo[190701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:54 compute-2 python3.9[190703]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398273.0850105-2296-78084349766886/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:54 compute-2 sudo[190701]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:54 compute-2 auditd[699]: Audit daemon rotating log files
Nov 29 06:37:54 compute-2 sudo[190854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jycgovxsjlfhuxihmuylvumegyutzngx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398274.5954678-2296-74899851920383/AnsiballZ_stat.py'
Nov 29 06:37:54 compute-2 sudo[190854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:55 compute-2 python3.9[190856]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:55 compute-2 sudo[190854]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:55 compute-2 sudo[190977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubmpkzzewupnzgvmcknjqmlfufgwuiga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398274.5954678-2296-74899851920383/AnsiballZ_copy.py'
Nov 29 06:37:55 compute-2 sudo[190977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:55 compute-2 python3.9[190979]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398274.5954678-2296-74899851920383/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 06:37:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:55.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 06:37:55 compute-2 sudo[190977]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:55.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:56 compute-2 sudo[191130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkwzgueukrgzmazesslsmpslkdbhtckq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398275.825394-2296-194733351450830/AnsiballZ_stat.py'
Nov 29 06:37:56 compute-2 sudo[191130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:56 compute-2 python3.9[191132]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:56 compute-2 sudo[191130]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:57 compute-2 sudo[191253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oougrrujusuvwmdcxazsmzlepvmohvxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398275.825394-2296-194733351450830/AnsiballZ_copy.py'
Nov 29 06:37:57 compute-2 sudo[191253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:57 compute-2 python3.9[191255]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398275.825394-2296-194733351450830/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:57 compute-2 sudo[191253]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:57.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:57.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:58 compute-2 sudo[191405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jezjinrsqoizxtpfgxsshonwsftcgsuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398277.8263867-2296-192359018781522/AnsiballZ_stat.py'
Nov 29 06:37:58 compute-2 sudo[191405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:58 compute-2 ceph-mon[77142]: pgmap v764: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:58 compute-2 python3.9[191407]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:58 compute-2 sudo[191405]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:58 compute-2 sudo[191529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnjcnbahcepeaypgjsaxosvspgzwknhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398277.8263867-2296-192359018781522/AnsiballZ_copy.py'
Nov 29 06:37:58 compute-2 sudo[191529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:58 compute-2 python3.9[191531]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398277.8263867-2296-192359018781522/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:58 compute-2 sudo[191529]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:59 compute-2 sudo[191681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlywamlcqmcgviytdlrkskjboxnhwpps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398279.0187278-2296-239034984973653/AnsiballZ_stat.py'
Nov 29 06:37:59 compute-2 sudo[191681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:59 compute-2 python3.9[191683]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:59 compute-2 sudo[191681]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:59.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:37:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 06:37:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:59.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 06:37:59 compute-2 sudo[191804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aegjemsrmzzpeqefwcwgghjkdibjoykq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398279.0187278-2296-239034984973653/AnsiballZ_copy.py'
Nov 29 06:37:59 compute-2 sudo[191804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:00 compute-2 python3.9[191806]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398279.0187278-2296-239034984973653/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:00 compute-2 sudo[191804]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:00 compute-2 ceph-mon[77142]: pgmap v765: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:01.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:01.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:02 compute-2 python3.9[191957]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:03 compute-2 sudo[192111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mudfoojoumexbzxrqceucuwohoeiqmlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398282.6365993-2914-18202862738864/AnsiballZ_seboolean.py'
Nov 29 06:38:03 compute-2 sudo[192111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:03.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:03.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:04 compute-2 python3.9[192113]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 29 06:38:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:05 compute-2 ceph-mon[77142]: pgmap v766: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:05 compute-2 ceph-mon[77142]: pgmap v767: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:05 compute-2 sudo[192111]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:05.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:05.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:07 compute-2 sudo[192144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:38:07 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 29 06:38:07 compute-2 sudo[192144]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:38:07 compute-2 sudo[192144]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:07 compute-2 sudo[192169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:38:07 compute-2 sudo[192169]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:38:07 compute-2 sudo[192169]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 06:38:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:07.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 06:38:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:07.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:08 compute-2 sudo[192320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ongylvbdnmdiqhbuoffxvqofgvxbbozm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398288.1949198-2937-65620093081998/AnsiballZ_copy.py'
Nov 29 06:38:08 compute-2 sudo[192320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:08 compute-2 python3.9[192322]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:08 compute-2 sudo[192320]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:09 compute-2 sudo[192370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:38:09 compute-2 sudo[192370]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:38:09 compute-2 sudo[192370]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:09 compute-2 sudo[192424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:38:09 compute-2 sudo[192424]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:38:09 compute-2 sudo[192424]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:09 compute-2 sudo[192522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmefbunprorapuvpzijqkodslxmivpxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398288.9434655-2937-190510177035903/AnsiballZ_copy.py'
Nov 29 06:38:09 compute-2 sudo[192522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:09 compute-2 python3.9[192524]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:09 compute-2 sudo[192522]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 06:38:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:09.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 06:38:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:09.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:10 compute-2 sudo[192674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxlyshihpsynldwydtieqdbskjatrvgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398289.6866004-2937-268060865969491/AnsiballZ_copy.py'
Nov 29 06:38:10 compute-2 sudo[192674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:10 compute-2 python3.9[192676]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:10 compute-2 sudo[192674]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:10 compute-2 ceph-mon[77142]: pgmap v768: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:10 compute-2 ceph-mon[77142]: pgmap v769: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:10 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:38:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:10 compute-2 sudo[192828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-solgpkxmjdlobqmbyelfbqyolqpfksve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398290.4820848-2937-261216595029592/AnsiballZ_copy.py'
Nov 29 06:38:10 compute-2 sudo[192828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:10 compute-2 python3.9[192830]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:10 compute-2 sudo[192828]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:11 compute-2 sudo[192980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzhnfczddljgylctpivxqyisgoxewdaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398291.079193-2937-2321470325807/AnsiballZ_copy.py'
Nov 29 06:38:11 compute-2 sudo[192980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:11 compute-2 ceph-mon[77142]: pgmap v770: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:11 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:38:11 compute-2 python3.9[192982]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:11 compute-2 sudo[192980]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:11.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:11.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:12 compute-2 sudo[193132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oegsslhqbqgicgmwiwjertmhmqrpaugs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398291.9927418-3045-76084055344712/AnsiballZ_copy.py'
Nov 29 06:38:12 compute-2 sudo[193132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:12 compute-2 python3.9[193134]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:12 compute-2 sudo[193132]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:12 compute-2 ceph-mon[77142]: pgmap v771: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:12 compute-2 ceph-mon[77142]: pgmap v772: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:12 compute-2 sudo[193285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dokinkjdzewkjccvgtjoiekyxhdwxjaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398292.6700842-3045-191138884944628/AnsiballZ_copy.py'
Nov 29 06:38:12 compute-2 sudo[193285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:13 compute-2 python3.9[193287]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:13 compute-2 sudo[193285]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:13 compute-2 sudo[193437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yihgbqrfygdqgcfbagspiiyziwmqmwvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398293.2805166-3045-251409094243957/AnsiballZ_copy.py'
Nov 29 06:38:13 compute-2 sudo[193437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 06:38:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:13.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 06:38:13 compute-2 python3.9[193439]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:13 compute-2 sudo[193437]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:13.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:14 compute-2 sudo[193589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpapvfuujrsdhsicuazjswqzkazfhlgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398293.927095-3045-153139348287280/AnsiballZ_copy.py'
Nov 29 06:38:14 compute-2 sudo[193589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:14 compute-2 python3.9[193591]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:14 compute-2 sudo[193589]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:14 compute-2 ceph-mon[77142]: pgmap v773: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:14 compute-2 sudo[193742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epkacvxndqwvwooilzhzrvkilyqsxqdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398294.6717548-3045-241390367605937/AnsiballZ_copy.py'
Nov 29 06:38:14 compute-2 sudo[193742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:38:15.129 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:38:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:38:15.130 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:38:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:38:15.130 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:38:15 compute-2 python3.9[193744]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:15 compute-2 sudo[193742]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:15.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:15.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:15 compute-2 sudo[193894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfvoclxckiialmyvhajxuustafafkjoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398295.7218928-3153-176296322482353/AnsiballZ_systemd.py'
Nov 29 06:38:15 compute-2 sudo[193894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:16 compute-2 ceph-mon[77142]: pgmap v774: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:16 compute-2 python3.9[193896]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:38:16 compute-2 systemd[1]: Reloading.
Nov 29 06:38:16 compute-2 systemd-sysv-generator[193927]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:16 compute-2 systemd-rc-local-generator[193924]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:16 compute-2 systemd[1]: Starting libvirt logging daemon socket...
Nov 29 06:38:16 compute-2 systemd[1]: Listening on libvirt logging daemon socket.
Nov 29 06:38:16 compute-2 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 29 06:38:16 compute-2 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 29 06:38:16 compute-2 systemd[1]: Starting libvirt logging daemon...
Nov 29 06:38:16 compute-2 systemd[1]: Started libvirt logging daemon.
Nov 29 06:38:16 compute-2 sudo[193894]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:17 compute-2 ceph-mon[77142]: pgmap v775: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:17 compute-2 sudo[194088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjgbpidpxwyscucxuogohedibbboygyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398297.0893967-3153-163850048003767/AnsiballZ_systemd.py'
Nov 29 06:38:17 compute-2 sudo[194088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:17.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:17 compute-2 python3.9[194090]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:38:17 compute-2 systemd[1]: Reloading.
Nov 29 06:38:17 compute-2 systemd-rc-local-generator[194116]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:17 compute-2 systemd-sysv-generator[194119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:17.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:18 compute-2 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 29 06:38:18 compute-2 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 29 06:38:18 compute-2 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 29 06:38:18 compute-2 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 29 06:38:18 compute-2 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 29 06:38:18 compute-2 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 29 06:38:18 compute-2 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 29 06:38:18 compute-2 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 06:38:18 compute-2 systemd[1]: Started libvirt nodedev daemon.
Nov 29 06:38:18 compute-2 sudo[194088]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:18 compute-2 podman[194127]: 2025-11-29 06:38:18.213601642 +0000 UTC m=+0.159232301 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 06:38:18 compute-2 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 29 06:38:18 compute-2 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 29 06:38:18 compute-2 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 29 06:38:18 compute-2 sudo[194337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgtvpqowpwagjeyxbcwyxmvesovhcsjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398298.2965007-3153-89100282825326/AnsiballZ_systemd.py'
Nov 29 06:38:18 compute-2 sudo[194337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:18 compute-2 python3.9[194339]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:38:18 compute-2 systemd[1]: Reloading.
Nov 29 06:38:19 compute-2 systemd-rc-local-generator[194366]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:19 compute-2 systemd-sysv-generator[194370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:19 compute-2 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 29 06:38:19 compute-2 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 29 06:38:19 compute-2 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 29 06:38:19 compute-2 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 29 06:38:19 compute-2 systemd[1]: Starting libvirt proxy daemon...
Nov 29 06:38:19 compute-2 systemd[1]: Started libvirt proxy daemon.
Nov 29 06:38:19 compute-2 sudo[194337]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:19 compute-2 setroubleshoot[194128]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 745ff3a9-6485-4737-8d50-c2f2d563dc3c
Nov 29 06:38:19 compute-2 setroubleshoot[194128]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 29 06:38:19 compute-2 setroubleshoot[194128]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 745ff3a9-6485-4737-8d50-c2f2d563dc3c
Nov 29 06:38:19 compute-2 setroubleshoot[194128]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 29 06:38:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:19.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:19.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:19 compute-2 sudo[194550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klqklelmlsnoybavzalybjcekxgagdux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398299.56556-3153-98284890458425/AnsiballZ_systemd.py'
Nov 29 06:38:20 compute-2 sudo[194550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:20 compute-2 python3.9[194552]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:38:20 compute-2 systemd[1]: Reloading.
Nov 29 06:38:20 compute-2 systemd-sysv-generator[194582]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:20 compute-2 systemd-rc-local-generator[194578]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:20 compute-2 ceph-mon[77142]: pgmap v776: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:20 compute-2 systemd[1]: Listening on libvirt locking daemon socket.
Nov 29 06:38:20 compute-2 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 29 06:38:20 compute-2 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 29 06:38:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:20 compute-2 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 29 06:38:20 compute-2 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 29 06:38:20 compute-2 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 29 06:38:20 compute-2 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 29 06:38:20 compute-2 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 29 06:38:20 compute-2 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 29 06:38:20 compute-2 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 29 06:38:20 compute-2 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 06:38:20 compute-2 podman[194590]: 2025-11-29 06:38:20.669198575 +0000 UTC m=+0.053438713 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 06:38:20 compute-2 systemd[1]: Started libvirt QEMU daemon.
Nov 29 06:38:20 compute-2 sudo[194550]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:21 compute-2 sudo[194785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keupybeavsgqyyflhmptqjpgckuqdjol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398300.8940923-3153-233952496087362/AnsiballZ_systemd.py'
Nov 29 06:38:21 compute-2 sudo[194785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:21 compute-2 python3.9[194787]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:38:21 compute-2 systemd[1]: Reloading.
Nov 29 06:38:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:21.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:21 compute-2 systemd-sysv-generator[194820]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:21 compute-2 systemd-rc-local-generator[194816]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:21 compute-2 ceph-mon[77142]: pgmap v777: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:21.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:21 compute-2 systemd[1]: Starting libvirt secret daemon socket...
Nov 29 06:38:21 compute-2 systemd[1]: Listening on libvirt secret daemon socket.
Nov 29 06:38:21 compute-2 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 29 06:38:21 compute-2 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 29 06:38:21 compute-2 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 29 06:38:21 compute-2 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 29 06:38:21 compute-2 systemd[1]: Starting libvirt secret daemon...
Nov 29 06:38:21 compute-2 systemd[1]: Started libvirt secret daemon.
Nov 29 06:38:22 compute-2 sudo[194785]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000032s ======
Nov 29 06:38:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:23.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Nov 29 06:38:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:23.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:24 compute-2 ceph-mon[77142]: pgmap v778: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:24 compute-2 sudo[195000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isegrjgxzhtrgwcvdtmyrtmxsztdxsgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398304.1876066-3265-91020121678553/AnsiballZ_file.py'
Nov 29 06:38:24 compute-2 sudo[195000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:24 compute-2 python3.9[195002]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:24 compute-2 sudo[195000]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:25 compute-2 sudo[195152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjftvtvfbwkfdoztmoldbxseftlydrxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398305.2793055-3288-14781642872132/AnsiballZ_find.py'
Nov 29 06:38:25 compute-2 sudo[195152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:25.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:25 compute-2 python3.9[195154]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:38:25 compute-2 sudo[195152]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000032s ======
Nov 29 06:38:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:25.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Nov 29 06:38:26 compute-2 sudo[195305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qirzxbccurttumkhraoavhesgfylfhjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398306.1875873-3312-173628089320694/AnsiballZ_command.py'
Nov 29 06:38:26 compute-2 sudo[195305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:26 compute-2 ceph-mon[77142]: pgmap v779: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:26 compute-2 python3.9[195307]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:26 compute-2 sudo[195305]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 06:38:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:27.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 06:38:27 compute-2 ceph-mon[77142]: pgmap v780: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:27.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:28 compute-2 python3.9[195461]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:38:29 compute-2 sudo[195595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:38:29 compute-2 sudo[195595]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:38:29 compute-2 sudo[195595]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:29 compute-2 sudo[195638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:38:29 compute-2 sudo[195638]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:38:29 compute-2 sudo[195638]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:29 compute-2 python3.9[195633]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:29 compute-2 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 29 06:38:29 compute-2 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 29 06:38:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:29.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:29 compute-2 python3.9[195783]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398308.7991729-3370-237674090291383/.source.xml follow=False _original_basename=secret.xml.j2 checksum=63744b3abb892aaab98ed7226f328ffc66ff66bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 06:38:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:29.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 06:38:30 compute-2 ceph-mon[77142]: pgmap v781: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:30 compute-2 sudo[195934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hodumjwogstovkaqamccpxmpksxdvqeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398310.3166249-3414-185412626519615/AnsiballZ_command.py'
Nov 29 06:38:30 compute-2 sudo[195934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:30 compute-2 python3.9[195936]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 336ec58c-893b-528f-a0c1-6ed1196bc047
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:30 compute-2 polkitd[43476]: Registered Authentication Agent for unix-process:195938:377947 (system bus name :1.1926 [pkttyagent --process 195938 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Nov 29 06:38:30 compute-2 polkitd[43476]: Unregistered Authentication Agent for unix-process:195938:377947 (system bus name :1.1926, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Nov 29 06:38:30 compute-2 polkitd[43476]: Registered Authentication Agent for unix-process:195937:377947 (system bus name :1.1927 [pkttyagent --process 195937 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Nov 29 06:38:30 compute-2 polkitd[43476]: Unregistered Authentication Agent for unix-process:195937:377947 (system bus name :1.1927, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Nov 29 06:38:30 compute-2 sudo[195934]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:31 compute-2 ceph-mon[77142]: pgmap v782: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:31 compute-2 python3.9[196098]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 06:38:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:31.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 06:38:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:31.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:32 compute-2 sudo[196248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teovznfigccrbooxlaiujengzcdeefwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398311.957699-3463-87952323745797/AnsiballZ_command.py'
Nov 29 06:38:32 compute-2 sudo[196248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:32 compute-2 sudo[196248]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:33 compute-2 sudo[196402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiklkgqmvyapaphpkfvjsvnyebljjdmz ; FSID=336ec58c-893b-528f-a0c1-6ed1196bc047 KEY=AQCBjyppAAAAABAAXQRTF6pnk4WV7TfvJo0Mjg== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398312.7882848-3486-104642420699380/AnsiballZ_command.py'
Nov 29 06:38:33 compute-2 sudo[196402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:33 compute-2 polkitd[43476]: Registered Authentication Agent for unix-process:196405:378187 (system bus name :1.1930 [pkttyagent --process 196405 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Nov 29 06:38:33 compute-2 polkitd[43476]: Unregistered Authentication Agent for unix-process:196405:378187 (system bus name :1.1930, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Nov 29 06:38:33 compute-2 sudo[196402]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:33 compute-2 ceph-mon[77142]: pgmap v783: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.002000062s ======
Nov 29 06:38:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:33.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000062s
Nov 29 06:38:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 06:38:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:33.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 06:38:33 compute-2 sudo[196560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahuqekxkprrwxnmaoezlaxgmnaulhbmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398313.5914233-3510-58766290634918/AnsiballZ_copy.py'
Nov 29 06:38:33 compute-2 sudo[196560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:34 compute-2 python3.9[196562]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:34 compute-2 sudo[196560]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:34 compute-2 sudo[196713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycgtepknkmzvrobazcgxzszmeywzqbsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398314.4277127-3536-46108208255348/AnsiballZ_stat.py'
Nov 29 06:38:34 compute-2 sudo[196713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:34 compute-2 python3.9[196715]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:34 compute-2 sudo[196713]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:35 compute-2 sudo[196836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enpssmqpycmyhfmnpzbgdtjazeyvqefq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398314.4277127-3536-46108208255348/AnsiballZ_copy.py'
Nov 29 06:38:35 compute-2 sudo[196836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:35 compute-2 python3.9[196838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398314.4277127-3536-46108208255348/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:35 compute-2 sudo[196836]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:35.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:35 compute-2 ceph-mon[77142]: pgmap v784: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:35.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:36 compute-2 sudo[196988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnjlstlrrjibogndomvyjztutpnuruyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398316.0513372-3583-33808743212276/AnsiballZ_file.py'
Nov 29 06:38:36 compute-2 sudo[196988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:36 compute-2 python3.9[196990]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:36 compute-2 sudo[196988]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:37 compute-2 sudo[197141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjdlomqyjhntonoxeljxzhdxflrupxig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398316.77119-3607-146940705057141/AnsiballZ_stat.py'
Nov 29 06:38:37 compute-2 sudo[197141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:37 compute-2 python3.9[197143]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:37 compute-2 sudo[197141]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:37 compute-2 sudo[197219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idngnqsjnbpjttwbsxftkkebalhdcjio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398316.77119-3607-146940705057141/AnsiballZ_file.py'
Nov 29 06:38:37 compute-2 sudo[197219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:38:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:37.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:38:37 compute-2 python3.9[197221]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:37 compute-2 sudo[197219]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:38:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:37.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:38:38 compute-2 ceph-mon[77142]: pgmap v785: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:38 compute-2 sudo[197371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tndrmnpsyyastjwqhbhjewzaqwfcihlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398318.0552256-3643-112612970903880/AnsiballZ_stat.py'
Nov 29 06:38:38 compute-2 sudo[197371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:38 compute-2 python3.9[197373]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:38 compute-2 sudo[197371]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:38 compute-2 sudo[197450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugucdzjkcoishezzmkmzjtdpntbvmpjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398318.0552256-3643-112612970903880/AnsiballZ_file.py'
Nov 29 06:38:38 compute-2 sudo[197450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:38 compute-2 python3.9[197452]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.uzdfzrlg recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:38 compute-2 sudo[197450]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:39 compute-2 sudo[197602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keiwcpohmlhpqwkledfeubztcejxpfoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398319.3841121-3679-228651657113877/AnsiballZ_stat.py'
Nov 29 06:38:39 compute-2 sudo[197602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:38:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:39.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:38:39 compute-2 python3.9[197604]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:39 compute-2 sudo[197602]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:39.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:40 compute-2 sudo[197680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvrrmnrudswvfxxzmepwitwxqrvksaya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398319.3841121-3679-228651657113877/AnsiballZ_file.py'
Nov 29 06:38:40 compute-2 sudo[197680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:40 compute-2 python3.9[197682]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:40 compute-2 sudo[197680]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:40 compute-2 ceph-mon[77142]: pgmap v786: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:41 compute-2 sudo[197833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eocowquiqijdiichihiztfmopdvzhyic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398320.7476354-3718-45292324686016/AnsiballZ_command.py'
Nov 29 06:38:41 compute-2 sudo[197833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:41 compute-2 python3.9[197835]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:41 compute-2 sudo[197833]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:38:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:41.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:38:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:38:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:41.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:38:42 compute-2 sudo[197987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pexztlmkkibkbpydwqbkzcexngnmnoqs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398321.6320364-3742-206930937802484/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:38:42 compute-2 sudo[197987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:42 compute-2 python3[197989]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:38:42 compute-2 sudo[197987]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:43 compute-2 sudo[198139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkiiuhoqotcpgtkcfydopmkwbfonhqll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398322.92053-3765-94026556018657/AnsiballZ_stat.py'
Nov 29 06:38:43 compute-2 sudo[198139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:43 compute-2 python3.9[198141]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:43 compute-2 sudo[198139]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:38:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:43.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:38:43 compute-2 sudo[198217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weamabttwxatvpexhtaiybaxnhlixsdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398322.92053-3765-94026556018657/AnsiballZ_file.py'
Nov 29 06:38:43 compute-2 sudo[198217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:43 compute-2 ceph-mon[77142]: pgmap v787: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:43.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:43 compute-2 python3.9[198219]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:43 compute-2 sudo[198217]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:44 compute-2 sudo[198370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfouvoiwkfivktdofxcroaqesghplhhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398324.3622124-3802-30457514304128/AnsiballZ_stat.py'
Nov 29 06:38:44 compute-2 sudo[198370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:44 compute-2 python3.9[198372]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:44 compute-2 sudo[198370]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:45 compute-2 ceph-mon[77142]: pgmap v788: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:45 compute-2 sudo[198448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idcrxmsjmjpaeimmclwcxmlqrcrpybnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398324.3622124-3802-30457514304128/AnsiballZ_file.py'
Nov 29 06:38:45 compute-2 sudo[198448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:45 compute-2 python3.9[198450]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:45 compute-2 sudo[198448]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:38:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:45.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:38:45 compute-2 sudo[198600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obxsddkpbwbdvqmrbucgeshltosjsrtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398325.5916004-3838-112652172974588/AnsiballZ_stat.py'
Nov 29 06:38:45 compute-2 sudo[198600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:45.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:46 compute-2 python3.9[198602]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:46 compute-2 sudo[198600]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:46 compute-2 sudo[198678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdhavduuhaxywbujikkluoslrvzcyrri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398325.5916004-3838-112652172974588/AnsiballZ_file.py'
Nov 29 06:38:46 compute-2 sudo[198678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:46 compute-2 python3.9[198680]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:46 compute-2 sudo[198678]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:47 compute-2 ceph-mon[77142]: pgmap v789: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:47 compute-2 sudo[198831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzicmdablojxnrbyiubdnhrbgokejzdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398327.1493378-3873-78252648450439/AnsiballZ_stat.py'
Nov 29 06:38:47 compute-2 sudo[198831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:47 compute-2 python3.9[198833]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:47 compute-2 sudo[198831]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:47.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:47 compute-2 sudo[198909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyzsdselveairrryablydrtvvmndmcjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398327.1493378-3873-78252648450439/AnsiballZ_file.py'
Nov 29 06:38:47 compute-2 sudo[198909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:38:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:47.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:38:48 compute-2 python3.9[198911]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:48 compute-2 sudo[198909]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:48 compute-2 ceph-mon[77142]: pgmap v790: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:48 compute-2 sudo[199073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbeajxcofmwjedmsjybyyavbmqpjqbsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398328.4041808-3910-213663633046478/AnsiballZ_stat.py'
Nov 29 06:38:48 compute-2 sudo[199073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:48 compute-2 podman[199036]: 2025-11-29 06:38:48.949626026 +0000 UTC m=+0.106376866 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 29 06:38:49 compute-2 python3.9[199080]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:49 compute-2 sudo[199073]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:49 compute-2 ceph-mon[77142]: pgmap v791: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:49 compute-2 sudo[199163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:38:49 compute-2 sudo[199163]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:38:49 compute-2 sudo[199163]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:49 compute-2 sudo[199201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:38:49 compute-2 sudo[199201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:38:49 compute-2 sudo[199201]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:49 compute-2 sudo[199263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltbpvtoehomuzjzuwmreaepgkzwlxqvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398328.4041808-3910-213663633046478/AnsiballZ_copy.py'
Nov 29 06:38:49 compute-2 sudo[199263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:49 compute-2 python3.9[199265]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398328.4041808-3910-213663633046478/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:49 compute-2 sudo[199263]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:38:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:49.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:38:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:49.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:50 compute-2 sudo[199415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxsczhfjsfdsdwqtkjvslitrrcgtluwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398329.9803386-3954-92380413214229/AnsiballZ_file.py'
Nov 29 06:38:50 compute-2 sudo[199415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:50 compute-2 python3.9[199417]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:50 compute-2 sudo[199415]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:50 compute-2 podman[199494]: 2025-11-29 06:38:50.893690236 +0000 UTC m=+0.049553183 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 29 06:38:51 compute-2 sudo[199587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwmjonmzdszntjjkcycztpkdfnciijxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398330.7632866-3979-66309260230937/AnsiballZ_command.py'
Nov 29 06:38:51 compute-2 sudo[199587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:51 compute-2 python3.9[199589]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:51 compute-2 sudo[199587]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:51 compute-2 ceph-mon[77142]: pgmap v792: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:51.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:51.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:52 compute-2 sudo[199742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdbldlizxbktwryfixjirmwnhnwlmyla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398331.6441603-4003-208447497418765/AnsiballZ_blockinfile.py'
Nov 29 06:38:52 compute-2 sudo[199742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:52 compute-2 python3.9[199744]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:52 compute-2 sudo[199742]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:53 compute-2 sudo[199895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbfmrdptxlqonphhfexrzawyzhzyldma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398332.9614408-4030-147426263112534/AnsiballZ_command.py'
Nov 29 06:38:53 compute-2 sudo[199895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:53 compute-2 python3.9[199897]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:53 compute-2 sudo[199895]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:53.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:53 compute-2 ceph-mon[77142]: pgmap v793: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:53.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:54 compute-2 sudo[200048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyvaqvpyjtohgzmubpcpeezsaegaohhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398333.7543938-4054-151864246892240/AnsiballZ_stat.py'
Nov 29 06:38:54 compute-2 sudo[200048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:54 compute-2 python3.9[200050]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:38:54 compute-2 sudo[200048]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:54 compute-2 sudo[200203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjthbbazrvvkotwkqsgpnlregskysgax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398334.5613863-4079-206641418547289/AnsiballZ_command.py'
Nov 29 06:38:54 compute-2 sudo[200203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:55 compute-2 python3.9[200205]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:55 compute-2 sudo[200203]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:55 compute-2 ceph-mon[77142]: pgmap v794: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:55 compute-2 sudo[200358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyxpxejwrpawcbxzunurumhsytcxbugc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398335.3980584-4102-207241753806180/AnsiballZ_file.py'
Nov 29 06:38:55 compute-2 sudo[200358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:55.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:55 compute-2 python3.9[200360]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:55 compute-2 sudo[200358]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:55.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:56 compute-2 sudo[200511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovsuzkizpivoesbmblunnovrxugwoycr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398336.1805663-4126-162989859395076/AnsiballZ_stat.py'
Nov 29 06:38:56 compute-2 sudo[200511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:56 compute-2 python3.9[200513]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:56 compute-2 sudo[200511]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:57 compute-2 sudo[200634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kejqsxowpcozkmedsppybuxnyjzpneik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398336.1805663-4126-162989859395076/AnsiballZ_copy.py'
Nov 29 06:38:57 compute-2 sudo[200634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:57 compute-2 python3.9[200636]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398336.1805663-4126-162989859395076/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:57 compute-2 sudo[200634]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:57 compute-2 ceph-mon[77142]: pgmap v795: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:57.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:57.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:58 compute-2 sudo[200786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clipjrnodppynxnskutqgkuhjqmmdmjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398337.9224632-4170-273631381895449/AnsiballZ_stat.py'
Nov 29 06:38:58 compute-2 sudo[200786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:58 compute-2 python3.9[200788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:58 compute-2 sudo[200786]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:58 compute-2 sudo[200910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zndvversurhfrsmfsbsrggbazurifsej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398337.9224632-4170-273631381895449/AnsiballZ_copy.py'
Nov 29 06:38:58 compute-2 sudo[200910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:59 compute-2 python3.9[200912]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398337.9224632-4170-273631381895449/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:59 compute-2 sudo[200910]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:59 compute-2 sudo[201062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpkqnghvissddawmzgxiczwokvkjeued ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398339.3346808-4216-36531274254630/AnsiballZ_stat.py'
Nov 29 06:38:59 compute-2 sudo[201062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:59.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:59 compute-2 python3.9[201064]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:59 compute-2 sudo[201062]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:38:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:59.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:00 compute-2 sudo[201185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyxkvjazzohplutbemxwmqevkkbfpmpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398339.3346808-4216-36531274254630/AnsiballZ_copy.py'
Nov 29 06:39:00 compute-2 sudo[201185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:00 compute-2 ceph-mon[77142]: pgmap v796: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:00 compute-2 python3.9[201187]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398339.3346808-4216-36531274254630/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:00 compute-2 sudo[201185]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:01 compute-2 sudo[201338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aezokdvkbdwakykzdgrqwgwiicsstigp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398340.990701-4261-50026095618012/AnsiballZ_systemd.py'
Nov 29 06:39:01 compute-2 sudo[201338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:01 compute-2 python3.9[201340]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:39:01 compute-2 systemd[1]: Reloading.
Nov 29 06:39:01 compute-2 systemd-rc-local-generator[201365]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:01 compute-2 systemd-sysv-generator[201370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:01 compute-2 ceph-mon[77142]: pgmap v797: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:01.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:01.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:01 compute-2 systemd[1]: Reached target edpm_libvirt.target.
Nov 29 06:39:02 compute-2 sudo[201338]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:02 compute-2 sudo[201531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gthnmpgbtizncbgwrjrzkjsqwwwoonvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398342.3710868-4285-15406031422092/AnsiballZ_systemd.py'
Nov 29 06:39:02 compute-2 sudo[201531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:02 compute-2 python3.9[201533]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 06:39:02 compute-2 systemd[1]: Reloading.
Nov 29 06:39:03 compute-2 systemd-rc-local-generator[201558]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:03 compute-2 systemd-sysv-generator[201562]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:03 compute-2 ceph-mon[77142]: pgmap v798: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:03 compute-2 systemd[1]: Reloading.
Nov 29 06:39:03 compute-2 systemd-sysv-generator[201602]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:03 compute-2 systemd-rc-local-generator[201598]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:03 compute-2 sudo[201531]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:03.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:03.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:04 compute-2 sshd-session[177977]: Connection closed by 192.168.122.30 port 59886
Nov 29 06:39:04 compute-2 sshd-session[177973]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:39:04 compute-2 systemd[1]: session-49.scope: Deactivated successfully.
Nov 29 06:39:04 compute-2 systemd[1]: session-49.scope: Consumed 1min 27.908s CPU time.
Nov 29 06:39:04 compute-2 systemd-logind[784]: Session 49 logged out. Waiting for processes to exit.
Nov 29 06:39:04 compute-2 systemd-logind[784]: Removed session 49.
Nov 29 06:39:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:05.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:05.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:06 compute-2 ceph-mon[77142]: pgmap v799: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:07 compute-2 ceph-mon[77142]: pgmap v800: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:07 compute-2 sudo[201634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:07 compute-2 sudo[201634]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:07 compute-2 sudo[201634]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:07 compute-2 sudo[201659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:39:07 compute-2 sudo[201659]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:07 compute-2 sudo[201659]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:07 compute-2 sudo[201684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:07 compute-2 sudo[201684]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:07 compute-2 sudo[201684]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:07 compute-2 sudo[201709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:39:07 compute-2 sudo[201709]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:07.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:07.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:08 compute-2 sudo[201709]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:09 compute-2 sudo[201766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:09 compute-2 sudo[201766]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:09 compute-2 sudo[201766]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:09 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:09 compute-2 ceph-mon[77142]: pgmap v801: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:09 compute-2 sudo[201791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:09 compute-2 sudo[201791]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:09 compute-2 sudo[201791]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:09.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:39:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:09.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:10 compute-2 sshd-session[201816]: Accepted publickey for zuul from 192.168.122.30 port 44458 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:39:10 compute-2 systemd-logind[784]: New session 50 of user zuul.
Nov 29 06:39:10 compute-2 systemd[1]: Started Session 50 of User zuul.
Nov 29 06:39:10 compute-2 sshd-session[201816]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:39:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:10 compute-2 python3.9[201970]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:39:11 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:11 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:11 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:11 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:39:11 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:39:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:11.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:11.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:12 compute-2 ceph-mon[77142]: pgmap v802: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:39:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:39:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:39:12 compute-2 python3.9[202125]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:39:12 compute-2 network[202142]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:39:12 compute-2 network[202143]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:39:12 compute-2 network[202144]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:39:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:13.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:39:13 compute-2 ceph-mon[77142]: pgmap v803: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:13.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:39:15.130 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:39:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:39:15.131 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:39:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:39:15.132 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:39:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:15.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:15 compute-2 ceph-mon[77142]: pgmap v804: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:15.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:17 compute-2 sudo[202416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzfgjxkobbhmcytccytmxbmhiqoldsab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398357.2664394-109-91264514930897/AnsiballZ_setup.py'
Nov 29 06:39:17 compute-2 sudo[202416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:17.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:17 compute-2 python3.9[202418]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:39:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:17.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:18 compute-2 ceph-mon[77142]: pgmap v805: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:18 compute-2 sudo[202416]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:18 compute-2 sudo[202501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugnedluwywkogstqmmydqamtumvihizz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398357.2664394-109-91264514930897/AnsiballZ_dnf.py'
Nov 29 06:39:18 compute-2 sudo[202501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:39:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 2481 writes, 14K keys, 2481 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.03 MB/s
                                           Cumulative WAL: 2481 writes, 2481 syncs, 1.00 writes per sync, written: 0.03 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1115 writes, 4810 keys, 1115 commit groups, 1.0 writes per commit group, ingest: 11.88 MB, 0.02 MB/s
                                           Interval WAL: 1115 writes, 1115 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     23.4      0.73              0.04         5    0.146       0      0       0.0       0.0
                                             L6      1/0   10.12 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.2     90.1     76.3      0.50              0.10         4    0.125     17K   1774       0.0       0.0
                                            Sum      1/0   10.12 MB   0.0      0.0     0.0      0.0       0.1      0.0       0.0   3.2     36.6     44.9      1.23              0.14         9    0.136     17K   1774       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.2     23.9     25.0      1.01              0.06         4    0.252    9503   1038       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     90.1     76.3      0.50              0.10         4    0.125     17K   1774       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     23.4      0.73              0.04         4    0.182       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.017, interval 0.006
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.05 GB write, 0.05 MB/s write, 0.04 GB read, 0.04 MB/s read, 1.2 seconds
                                           Interval compaction: 0.02 GB write, 0.04 MB/s write, 0.02 GB read, 0.04 MB/s read, 1.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55be896f31f0#2 capacity: 304.00 MB usage: 1.52 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 8.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(68,1.34 MB,0.441245%) FilterBlock(9,58.36 KB,0.0187472%) IndexBlock(9,128.73 KB,0.0413543%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Nov 29 06:39:18 compute-2 python3.9[202503]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:39:19 compute-2 ceph-mon[77142]: pgmap v806: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:19.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:19 compute-2 podman[202505]: 2025-11-29 06:39:19.92931298 +0000 UTC m=+0.084818711 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 06:39:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:19.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:21 compute-2 sudo[202533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:21 compute-2 sudo[202533]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:21 compute-2 sudo[202533]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:21 compute-2 podman[202557]: 2025-11-29 06:39:21.474833414 +0000 UTC m=+0.056449605 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:39:21 compute-2 sudo[202564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:39:21 compute-2 sudo[202564]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:21 compute-2 sudo[202564]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:21.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:21.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:22 compute-2 ceph-mon[77142]: pgmap v807: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:23 compute-2 ceph-mon[77142]: pgmap v808: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:23.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:23 compute-2 sudo[202501]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:23.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:24 compute-2 sudo[202753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osgoxhzhmjuwbmkmkxkvdxxcfhteycbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398364.5000243-144-138190600634811/AnsiballZ_stat.py'
Nov 29 06:39:24 compute-2 sudo[202753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:25 compute-2 python3.9[202755]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:39:25 compute-2 sudo[202753]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:25.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:25 compute-2 ceph-mon[77142]: pgmap v809: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:25 compute-2 sudo[202905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqttvgmvwsyxfnvngvrruwhpspyxxjmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398365.4958675-175-143103596549626/AnsiballZ_command.py'
Nov 29 06:39:25 compute-2 sudo[202905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:25.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:26 compute-2 python3.9[202907]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:39:26 compute-2 sudo[202905]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:26 compute-2 sudo[203059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrhjehjtftwipqjgvsciysyksjsewnop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398366.5252576-205-211142345550560/AnsiballZ_stat.py'
Nov 29 06:39:26 compute-2 sudo[203059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:26 compute-2 python3.9[203061]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:39:27 compute-2 sudo[203059]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:27 compute-2 sudo[203211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fptgsaczebpsyksyxtzvpodoplieuimp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398367.2204208-229-57646905264226/AnsiballZ_command.py'
Nov 29 06:39:27 compute-2 sudo[203211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:27 compute-2 python3.9[203213]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:39:27 compute-2 sudo[203211]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:27.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:27 compute-2 ceph-mon[77142]: pgmap v810: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:27.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:39:28 compute-2 sudo[203366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqojtldwgnyhlwkixygoawyqqoejglhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398367.9435863-253-236948993648395/AnsiballZ_stat.py'
Nov 29 06:39:28 compute-2 sudo[203366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:28 compute-2 sshd-session[203239]: Invalid user admin from 92.118.39.92 port 40668
Nov 29 06:39:28 compute-2 python3.9[203368]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:28 compute-2 sudo[203366]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:28 compute-2 sshd-session[203239]: Connection closed by invalid user admin 92.118.39.92 port 40668 [preauth]
Nov 29 06:39:29 compute-2 sudo[203490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxhgbolszmwccocahjkdqrqqtkyubgoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398367.9435863-253-236948993648395/AnsiballZ_copy.py'
Nov 29 06:39:29 compute-2 sudo[203490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.069780) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369069859, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1500, "num_deletes": 250, "total_data_size": 3710953, "memory_usage": 3747472, "flush_reason": "Manual Compaction"}
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369082086, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1461897, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13079, "largest_seqno": 14574, "table_properties": {"data_size": 1457009, "index_size": 2284, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12208, "raw_average_key_size": 20, "raw_value_size": 1446470, "raw_average_value_size": 2402, "num_data_blocks": 103, "num_entries": 602, "num_filter_entries": 602, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398218, "oldest_key_time": 1764398218, "file_creation_time": 1764398369, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 12350 microseconds, and 5477 cpu microseconds.
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.082137) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1461897 bytes OK
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.082160) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.084550) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.084575) EVENT_LOG_v1 {"time_micros": 1764398369084567, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.084599) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3704043, prev total WAL file size 3704043, number of live WAL files 2.
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.085936) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323533' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1427KB)], [24(10MB)]
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369086028, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 12071632, "oldest_snapshot_seqno": -1}
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4613 keys, 9170191 bytes, temperature: kUnknown
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369166938, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 9170191, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9137089, "index_size": 20448, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11589, "raw_key_size": 112551, "raw_average_key_size": 24, "raw_value_size": 9051483, "raw_average_value_size": 1962, "num_data_blocks": 883, "num_entries": 4613, "num_filter_entries": 4613, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398369, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.167275) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 9170191 bytes
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.169163) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 149.0 rd, 113.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 10.1 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(14.5) write-amplify(6.3) OK, records in: 5066, records dropped: 453 output_compression: NoCompression
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.169200) EVENT_LOG_v1 {"time_micros": 1764398369169183, "job": 12, "event": "compaction_finished", "compaction_time_micros": 81011, "compaction_time_cpu_micros": 38004, "output_level": 6, "num_output_files": 1, "total_output_size": 9170191, "num_input_records": 5066, "num_output_records": 4613, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369169900, "job": 12, "event": "table_file_deletion", "file_number": 26}
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369173597, "job": 12, "event": "table_file_deletion", "file_number": 24}
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.085802) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.173770) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.173776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.173778) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.173780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.173782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-2 python3.9[203492]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398367.9435863-253-236948993648395/.source.iscsi _original_basename=.792wyc72 follow=False checksum=e97d79e3c2fa6d72bfed153b7b0babee6d736c42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:29 compute-2 sudo[203490]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:29 compute-2 sudo[203569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:29 compute-2 sudo[203569]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:29 compute-2 sudo[203569]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:29 compute-2 sudo[203594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:29 compute-2 sudo[203594]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:29 compute-2 sudo[203594]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:29.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:29 compute-2 ceph-mon[77142]: pgmap v811: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:29.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:30 compute-2 sudo[203692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhgzacoevwggfrozuyqpmnzgphiqilas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398369.5120418-298-132546438447309/AnsiballZ_file.py'
Nov 29 06:39:30 compute-2 sudo[203692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:30 compute-2 python3.9[203694]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:30 compute-2 sudo[203692]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:30 compute-2 sudo[203845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgiyspdbqxddcazsrfyuvxujjhstdmth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398370.4633615-321-192745880612843/AnsiballZ_lineinfile.py'
Nov 29 06:39:30 compute-2 sudo[203845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:31 compute-2 python3.9[203847]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:31 compute-2 sudo[203845]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:31.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:32.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:32 compute-2 ceph-mon[77142]: pgmap v812: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:32 compute-2 sudo[203997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujuufzaqbehfpkesqwdxtmedfisjtbxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398371.5402741-349-59269584686056/AnsiballZ_systemd_service.py'
Nov 29 06:39:32 compute-2 sudo[203997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:32 compute-2 python3.9[203999]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:39:32 compute-2 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 29 06:39:32 compute-2 sudo[203997]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:33 compute-2 ceph-mon[77142]: pgmap v813: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:33 compute-2 sudo[204154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmoldantjxyoapwcrjdtjqugstuxuznh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398372.8835979-373-226272185174123/AnsiballZ_systemd_service.py'
Nov 29 06:39:33 compute-2 sudo[204154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:33 compute-2 python3.9[204156]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:39:33 compute-2 systemd[1]: Reloading.
Nov 29 06:39:33 compute-2 systemd-sysv-generator[204190]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:33 compute-2 systemd-rc-local-generator[204186]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:33.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:33 compute-2 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 06:39:33 compute-2 systemd[1]: Starting Open-iSCSI...
Nov 29 06:39:33 compute-2 kernel: Loading iSCSI transport class v2.0-870.
Nov 29 06:39:33 compute-2 systemd[1]: Started Open-iSCSI.
Nov 29 06:39:33 compute-2 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 29 06:39:33 compute-2 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 29 06:39:33 compute-2 sudo[204154]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:34.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:35 compute-2 sudo[204356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgzeijphfxzgyxsobbykwvhvgochfsdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398375.1110883-406-262598038906143/AnsiballZ_service_facts.py'
Nov 29 06:39:35 compute-2 sudo[204356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:35 compute-2 python3.9[204358]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:39:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:35 compute-2 network[204375]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:39:35 compute-2 network[204376]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:39:35 compute-2 network[204377]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:39:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:35.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:36.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:36 compute-2 ceph-mon[77142]: pgmap v814: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:37 compute-2 ceph-mon[77142]: pgmap v815: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:37.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:38.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:39:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:39.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:39 compute-2 ceph-mon[77142]: pgmap v816: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:39 compute-2 sudo[204356]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:40.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:40 compute-2 sudo[204650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzamdqtltgfkjtjowtghaoxuxijaqaxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398380.4410424-435-210801575775659/AnsiballZ_file.py'
Nov 29 06:39:40 compute-2 sudo[204650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:40 compute-2 python3.9[204652]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:39:41 compute-2 sudo[204650]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:41.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:41 compute-2 ceph-mon[77142]: pgmap v817: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:42.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:43.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:43 compute-2 ceph-mon[77142]: pgmap v818: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:44.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:39:45 compute-2 sudo[204804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyydezmzfmobagzoikxaizmlxwoklyde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398384.5385299-459-139102921378281/AnsiballZ_modprobe.py'
Nov 29 06:39:45 compute-2 sudo[204804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:45.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:45 compute-2 python3.9[204806]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 29 06:39:45 compute-2 sudo[204804]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:46 compute-2 ceph-mon[77142]: pgmap v819: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:46.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:46 compute-2 sudo[204961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfhidtsuhbzlpcwxbnvatjydxgjcrsww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398386.488407-483-8029253838537/AnsiballZ_stat.py'
Nov 29 06:39:46 compute-2 sudo[204961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:47 compute-2 ceph-mon[77142]: pgmap v820: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:47 compute-2 python3.9[204963]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:47 compute-2 sudo[204961]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:47 compute-2 sudo[205084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qymqscsvljryqbstfjmorwmeyxychnpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398386.488407-483-8029253838537/AnsiballZ_copy.py'
Nov 29 06:39:47 compute-2 sudo[205084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:47.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:47 compute-2 python3.9[205086]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398386.488407-483-8029253838537/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:48 compute-2 sudo[205084]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:48.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:39:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 5796 writes, 24K keys, 5796 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5796 writes, 923 syncs, 6.28 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 443 writes, 694 keys, 443 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s
                                           Interval WAL: 443 writes, 211 syncs, 2.10 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Nov 29 06:39:49 compute-2 ceph-mon[77142]: pgmap v821: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:49 compute-2 sudo[205210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:49 compute-2 sudo[205210]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:49 compute-2 sudo[205210]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:49.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:49 compute-2 sudo[205264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esszerbkzqozhxkidoydtzjpwmugssom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398389.382758-531-98228293291910/AnsiballZ_lineinfile.py'
Nov 29 06:39:49 compute-2 sudo[205264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:49 compute-2 sudo[205262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:49 compute-2 sudo[205262]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:49 compute-2 sudo[205262]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:50 compute-2 python3.9[205277]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:50.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:50 compute-2 sudo[205264]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:50 compute-2 podman[205367]: 2025-11-29 06:39:50.960666899 +0000 UTC m=+0.104640272 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:39:51 compute-2 ceph-mon[77142]: pgmap v822: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:51.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:51 compute-2 podman[205397]: 2025-11-29 06:39:51.899788693 +0000 UTC m=+0.061899538 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:39:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:52.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:53 compute-2 ceph-mon[77142]: pgmap v823: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:53.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:54.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:55.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:56.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:39:56 compute-2 ceph-mon[77142]: pgmap v824: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:57 compute-2 sudo[205492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myeiknzolqmkjzoysehvzovtnwytstet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398390.4023461-556-199424925090211/AnsiballZ_systemd.py'
Nov 29 06:39:57 compute-2 sudo[205492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:57 compute-2 ceph-mon[77142]: pgmap v825: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:57 compute-2 python3.9[205494]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:39:57 compute-2 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 06:39:57 compute-2 systemd[1]: Stopped Load Kernel Modules.
Nov 29 06:39:57 compute-2 systemd[1]: Stopping Load Kernel Modules...
Nov 29 06:39:57 compute-2 systemd[1]: Starting Load Kernel Modules...
Nov 29 06:39:57 compute-2 systemd[1]: Finished Load Kernel Modules.
Nov 29 06:39:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:57.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:57 compute-2 sudo[205492]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:58.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:58 compute-2 sudo[205649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvoxazptdehtnacglfnsyplgoblplxhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398398.4223835-580-232453625117452/AnsiballZ_file.py'
Nov 29 06:39:58 compute-2 sudo[205649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:59 compute-2 python3.9[205651]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:39:59 compute-2 sudo[205649]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:39:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:59.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:00.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:00 compute-2 ceph-mon[77142]: pgmap v826: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:00 compute-2 sudo[205801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urnkgurjbkdpedmovgqblvjjlrnwquhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398399.982376-608-156615707318048/AnsiballZ_stat.py'
Nov 29 06:40:00 compute-2 sudo[205801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:00 compute-2 python3.9[205803]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:00 compute-2 sudo[205801]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:01 compute-2 sudo[205954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jemwvgrkjddcfaguybnhnfpafoefrwcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398400.7880845-634-268793987624997/AnsiballZ_stat.py'
Nov 29 06:40:01 compute-2 sudo[205954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:01 compute-2 ceph-mon[77142]: overall HEALTH_OK
Nov 29 06:40:01 compute-2 ceph-mon[77142]: pgmap v827: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:01 compute-2 python3.9[205956]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:01 compute-2 sudo[205954]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:01.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:02.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:02 compute-2 sudo[206107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvbfsgexazlpsdxqmvwisowhenzwyrwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398401.5194995-658-63452576492630/AnsiballZ_stat.py'
Nov 29 06:40:02 compute-2 sudo[206107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:02 compute-2 python3.9[206109]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:02 compute-2 sudo[206107]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:03 compute-2 sudo[206230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xetinegtsbdvgibkboaygdmhnkrqpezn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398401.5194995-658-63452576492630/AnsiballZ_copy.py'
Nov 29 06:40:03 compute-2 sudo[206230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:03 compute-2 python3.9[206232]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398401.5194995-658-63452576492630/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:03 compute-2 sudo[206230]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:03.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:03 compute-2 ceph-mon[77142]: pgmap v828: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:04.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:04 compute-2 sudo[206382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yolxhcknjzbsoexkhlsymkiwbwtwevma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398403.9392607-703-124693311711247/AnsiballZ_command.py'
Nov 29 06:40:04 compute-2 sudo[206382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:04 compute-2 python3.9[206384]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:04 compute-2 sudo[206382]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:05 compute-2 sudo[206536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyttnqyslanzvficxyurtmiwelaqkgcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398404.75054-726-51364101859804/AnsiballZ_lineinfile.py'
Nov 29 06:40:05 compute-2 sudo[206536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:05 compute-2 python3.9[206538]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:05 compute-2 sudo[206536]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:05.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:06.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:06 compute-2 sudo[206689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocleqnczazoggmhnaapmhyqvbdzcssey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398405.5127974-751-198542995134810/AnsiballZ_replace.py'
Nov 29 06:40:06 compute-2 sudo[206689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:06 compute-2 python3.9[206691]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:06 compute-2 sudo[206689]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:06 compute-2 ceph-mon[77142]: pgmap v829: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:07 compute-2 sudo[206841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzobznnotwjfqdmwhvfyemziloivlsgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398407.0756261-776-74796139255638/AnsiballZ_replace.py'
Nov 29 06:40:07 compute-2 sudo[206841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:07 compute-2 python3.9[206843]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:07 compute-2 sudo[206841]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:40:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:07.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:40:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:08.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:08 compute-2 sudo[206993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siaexhpjufispgqtiojoclfmltmihdwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398407.981341-802-72433527541622/AnsiballZ_lineinfile.py'
Nov 29 06:40:08 compute-2 sudo[206993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:08 compute-2 python3.9[206995]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:08 compute-2 sudo[206993]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:08 compute-2 ceph-mon[77142]: pgmap v830: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:08 compute-2 sudo[207146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tidnxdxskhrrepuujxjblchvdricwcms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398408.6482368-802-248704586723101/AnsiballZ_lineinfile.py'
Nov 29 06:40:08 compute-2 sudo[207146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:09 compute-2 python3.9[207148]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:09 compute-2 sudo[207146]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:09 compute-2 sudo[207298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imlpqacimkjclliiutjpdgssnbewwgiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398409.2707324-802-119751957935325/AnsiballZ_lineinfile.py'
Nov 29 06:40:09 compute-2 sudo[207298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:09.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:09 compute-2 sudo[207301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:09 compute-2 sudo[207301]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:09 compute-2 sudo[207301]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:10 compute-2 python3.9[207300]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:10 compute-2 sudo[207298]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:10.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:10 compute-2 sudo[207326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:10 compute-2 sudo[207326]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:10 compute-2 sudo[207326]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:10 compute-2 ceph-mon[77142]: pgmap v831: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:10 compute-2 sudo[207501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mergkiactvnrizuajiozihqpvhuoayay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398410.316759-802-180432142829400/AnsiballZ_lineinfile.py'
Nov 29 06:40:10 compute-2 sudo[207501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:10 compute-2 python3.9[207503]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:10 compute-2 sudo[207501]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:11 compute-2 ceph-mon[77142]: pgmap v832: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:11.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:11 compute-2 sudo[207653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zncnwqunnhcqsaqukvulryuglbenieor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398411.574194-889-262765354157777/AnsiballZ_stat.py'
Nov 29 06:40:11 compute-2 sudo[207653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:12 compute-2 python3.9[207655]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:12.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:12 compute-2 sudo[207653]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:12 compute-2 sudo[207808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wosfstsqnqndxyalcrzpupkzrvsynogh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398412.365781-913-256481233152324/AnsiballZ_file.py'
Nov 29 06:40:12 compute-2 sudo[207808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:12 compute-2 python3.9[207810]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:12 compute-2 sudo[207808]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:13 compute-2 ceph-mon[77142]: pgmap v833: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:13.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:13 compute-2 sudo[207960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkdxrmcssynptyjavjlubayffpnibann ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398413.5507357-940-218731022688456/AnsiballZ_file.py'
Nov 29 06:40:13 compute-2 sudo[207960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:14.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:14 compute-2 python3.9[207962]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:14 compute-2 sudo[207960]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:14 compute-2 sudo[208113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atyaawkxfpncpmaibpgosnttbyzbdrmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398414.3891675-964-110081837524995/AnsiballZ_stat.py'
Nov 29 06:40:14 compute-2 sudo[208113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:14 compute-2 python3.9[208115]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:14 compute-2 sudo[208113]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:15 compute-2 sudo[208191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjijhzqopfsveqgguspunuyiqfoyexyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398414.3891675-964-110081837524995/AnsiballZ_file.py'
Nov 29 06:40:15 compute-2 sudo[208191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:40:15.131 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:40:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:40:15.132 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:40:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:40:15.132 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:40:15 compute-2 python3.9[208193]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:15 compute-2 sudo[208191]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:15 compute-2 sudo[208343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ortyacbxfyfqvnbmjykjlsqfwjkdstzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398415.4473193-964-148118885747180/AnsiballZ_stat.py'
Nov 29 06:40:15 compute-2 sudo[208343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:15.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:15 compute-2 python3.9[208345]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:15 compute-2 sudo[208343]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:16.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:16 compute-2 sudo[208421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzifococljnkniuylzuicpujyogmvygi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398415.4473193-964-148118885747180/AnsiballZ_file.py'
Nov 29 06:40:16 compute-2 sudo[208421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:16 compute-2 python3.9[208423]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:16 compute-2 sudo[208421]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:16 compute-2 ceph-mon[77142]: pgmap v834: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:17 compute-2 sudo[208574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-damogieshqbbwhhzyzqnhvcjewxomwbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398417.1445034-1034-52534890515406/AnsiballZ_file.py'
Nov 29 06:40:17 compute-2 sudo[208574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:17 compute-2 python3.9[208576]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:17 compute-2 sudo[208574]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:17.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:17 compute-2 ceph-mon[77142]: pgmap v835: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:18.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:18 compute-2 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 29 06:40:18 compute-2 sudo[208727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azilkldoiodimxtnqihfqfuxwiekqiud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398417.9472954-1057-102336268034254/AnsiballZ_stat.py'
Nov 29 06:40:18 compute-2 sudo[208727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:18 compute-2 python3.9[208729]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:18 compute-2 sudo[208727]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:18 compute-2 sudo[208806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeeplxtyochsfbyqfqeubkjqezruswpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398417.9472954-1057-102336268034254/AnsiballZ_file.py'
Nov 29 06:40:18 compute-2 sudo[208806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:18 compute-2 python3.9[208808]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:18 compute-2 sudo[208806]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:19 compute-2 ceph-mon[77142]: pgmap v836: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:19 compute-2 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 06:40:19 compute-2 sudo[208959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bepbuawaotqzinkvmshicljsjbowkpwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398419.3333845-1093-144623095601718/AnsiballZ_stat.py'
Nov 29 06:40:19 compute-2 sudo[208959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:19 compute-2 python3.9[208961]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:40:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:19.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:40:19 compute-2 sudo[208959]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:20.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:20 compute-2 sudo[209037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnvjetspuxdepvcbgeelyiilkuyxxbxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398419.3333845-1093-144623095601718/AnsiballZ_file.py'
Nov 29 06:40:20 compute-2 sudo[209037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:20 compute-2 python3.9[209039]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:20 compute-2 sudo[209037]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:20 compute-2 sudo[209190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wepwtwqvzmqkvrjiiycfmynuzipjuyzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398420.6037264-1129-126747225550282/AnsiballZ_systemd.py'
Nov 29 06:40:20 compute-2 sudo[209190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:21 compute-2 python3.9[209192]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:21 compute-2 systemd[1]: Reloading.
Nov 29 06:40:21 compute-2 systemd-rc-local-generator[209239]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:40:21 compute-2 systemd-sysv-generator[209242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:40:21 compute-2 podman[209194]: 2025-11-29 06:40:21.313770729 +0000 UTC m=+0.082746184 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 06:40:21 compute-2 sudo[209190]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:21 compute-2 ceph-mon[77142]: pgmap v837: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:21 compute-2 sudo[209255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:21 compute-2 sudo[209255]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:21 compute-2 sudo[209255]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:21 compute-2 sudo[209282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:40:21 compute-2 sudo[209282]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:21 compute-2 sudo[209282]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:21 compute-2 sudo[209331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:21 compute-2 sudo[209331]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:21 compute-2 sudo[209331]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:21 compute-2 sudo[209356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Nov 29 06:40:21 compute-2 sudo[209356]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:21.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:22 compute-2 sudo[209356]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:22.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:22 compute-2 podman[209395]: 2025-11-29 06:40:22.066746235 +0000 UTC m=+0.081157151 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 29 06:40:22 compute-2 sudo[209494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:22 compute-2 sudo[209494]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:22 compute-2 sudo[209494]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:22 compute-2 sudo[209575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjfotdqqjtayleqlghpcuxtcwmdygrcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398422.1123571-1153-80851537243350/AnsiballZ_stat.py'
Nov 29 06:40:22 compute-2 sudo[209575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:22 compute-2 sudo[209566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:40:22 compute-2 sudo[209566]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:22 compute-2 sudo[209566]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:22 compute-2 sudo[209598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:22 compute-2 sudo[209598]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:22 compute-2 sudo[209598]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:22 compute-2 sudo[209623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:40:22 compute-2 sudo[209623]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:22 compute-2 python3.9[209594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:22 compute-2 sudo[209575]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:22 compute-2 sudo[209741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wysbmfvitdxzgkuwoqdwklrvcmlryorn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398422.1123571-1153-80851537243350/AnsiballZ_file.py'
Nov 29 06:40:22 compute-2 sudo[209741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:22 compute-2 sudo[209623]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:23 compute-2 python3.9[209745]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:23 compute-2 sudo[209741]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:40:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:40:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 06:40:23 compute-2 ceph-mon[77142]: pgmap v838: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 06:40:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:40:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:40:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:23.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:24.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:24 compute-2 sudo[209907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpcbeyqcxuwkoscqrkownuvhqljrleei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398423.7569275-1189-171710460743429/AnsiballZ_stat.py'
Nov 29 06:40:24 compute-2 sudo[209907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:24 compute-2 python3.9[209909]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:24 compute-2 sudo[209907]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.342195) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424342236, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 734, "num_deletes": 252, "total_data_size": 1398800, "memory_usage": 1425104, "flush_reason": "Manual Compaction"}
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424348967, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 924399, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14579, "largest_seqno": 15308, "table_properties": {"data_size": 920862, "index_size": 1381, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 6914, "raw_average_key_size": 16, "raw_value_size": 913841, "raw_average_value_size": 2191, "num_data_blocks": 63, "num_entries": 417, "num_filter_entries": 417, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398369, "oldest_key_time": 1764398369, "file_creation_time": 1764398424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 6825 microseconds, and 2849 cpu microseconds.
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.349017) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 924399 bytes OK
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.349039) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.350426) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.350441) EVENT_LOG_v1 {"time_micros": 1764398424350437, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.350456) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1394898, prev total WAL file size 1394898, number of live WAL files 2.
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.351044) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323533' seq:0, type:0; will stop at (end)
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(902KB)], [27(8955KB)]
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424351105, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 10094590, "oldest_snapshot_seqno": -1}
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4513 keys, 9525612 bytes, temperature: kUnknown
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424422897, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 9525612, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9492743, "index_size": 20471, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11333, "raw_key_size": 112202, "raw_average_key_size": 24, "raw_value_size": 9408392, "raw_average_value_size": 2084, "num_data_blocks": 864, "num_entries": 4513, "num_filter_entries": 4513, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.423327) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 9525612 bytes
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.424565) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 140.0 rd, 132.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 8.7 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(21.2) write-amplify(10.3) OK, records in: 5030, records dropped: 517 output_compression: NoCompression
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.424582) EVENT_LOG_v1 {"time_micros": 1764398424424574, "job": 14, "event": "compaction_finished", "compaction_time_micros": 72083, "compaction_time_cpu_micros": 20512, "output_level": 6, "num_output_files": 1, "total_output_size": 9525612, "num_input_records": 5030, "num_output_records": 4513, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424424795, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424426355, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.350938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.426400) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.426404) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.426406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.426407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.426409) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-2 sudo[209986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upzqawvwpafwstphqxykdvwyonwadbam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398423.7569275-1189-171710460743429/AnsiballZ_file.py'
Nov 29 06:40:24 compute-2 sudo[209986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:24 compute-2 python3.9[209988]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:24 compute-2 sudo[209986]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:40:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:40:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:40:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:40:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:25 compute-2 sudo[210138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgowthfonhywefhtdmjonxexzcwpwaqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398425.3681865-1225-62705457308845/AnsiballZ_systemd.py'
Nov 29 06:40:25 compute-2 sudo[210138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:25.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:25 compute-2 python3.9[210140]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:25 compute-2 systemd[1]: Reloading.
Nov 29 06:40:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:26.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:26 compute-2 systemd-rc-local-generator[210167]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:40:26 compute-2 systemd-sysv-generator[210173]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:40:26 compute-2 systemd[1]: Starting Create netns directory...
Nov 29 06:40:26 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:40:26 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:40:26 compute-2 systemd[1]: Finished Create netns directory.
Nov 29 06:40:26 compute-2 sudo[210138]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:26 compute-2 ceph-mon[77142]: pgmap v839: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:27 compute-2 sudo[210332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuuyrradqtopiwbufcutfoysmtmelmxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398427.1953826-1255-82265066105414/AnsiballZ_file.py'
Nov 29 06:40:27 compute-2 sudo[210332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:27 compute-2 python3.9[210334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:27 compute-2 sudo[210332]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:27.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:28.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:28 compute-2 ceph-mon[77142]: pgmap v840: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:28 compute-2 sudo[210484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wafaacgpifbhbkpceuxgenaduytsvqjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398428.054028-1279-122043633559972/AnsiballZ_stat.py'
Nov 29 06:40:28 compute-2 sudo[210484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:28 compute-2 python3.9[210486]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:28 compute-2 sudo[210484]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:29 compute-2 sudo[210608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwsklhowxfkwexqwkbjoynjcarvgcqyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398428.054028-1279-122043633559972/AnsiballZ_copy.py'
Nov 29 06:40:29 compute-2 sudo[210608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:29 compute-2 python3.9[210610]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398428.054028-1279-122043633559972/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:29 compute-2 sudo[210608]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:29.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:30.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:30 compute-2 sudo[210635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:30 compute-2 sudo[210635]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:30 compute-2 sudo[210635]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:30 compute-2 sudo[210660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:30 compute-2 sudo[210660]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:30 compute-2 sudo[210660]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:30 compute-2 ceph-mon[77142]: pgmap v841: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:30 compute-2 sudo[210811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocvafpgbzljauzlgbzmtrgwwhcbycemt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398430.343613-1330-146689176695901/AnsiballZ_file.py'
Nov 29 06:40:30 compute-2 sudo[210811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:30 compute-2 python3.9[210813]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:30 compute-2 sudo[210811]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:31 compute-2 sudo[210963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcpmmvhkatmzndspxjaostwprxhlerwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398431.1757958-1353-233332167031998/AnsiballZ_stat.py'
Nov 29 06:40:31 compute-2 sudo[210963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:31 compute-2 python3.9[210965]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:31 compute-2 sudo[210963]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:31.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:31 compute-2 ceph-mon[77142]: pgmap v842: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:31 compute-2 sudo[211086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phpukriudiguputztgvlwfrntcpnylze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398431.1757958-1353-233332167031998/AnsiballZ_copy.py'
Nov 29 06:40:31 compute-2 sudo[211086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:32.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:32 compute-2 python3.9[211088]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398431.1757958-1353-233332167031998/.source.json _original_basename=.qy6x84y7 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:32 compute-2 sudo[211086]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:32 compute-2 sudo[211239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fizvwcvfbqyugnjimqcglbdimsetedna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398432.5504873-1398-114236686197141/AnsiballZ_file.py'
Nov 29 06:40:32 compute-2 sudo[211239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:33 compute-2 python3.9[211241]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:33 compute-2 sudo[211239]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:33 compute-2 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 29 06:40:33 compute-2 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 29 06:40:33 compute-2 ceph-mon[77142]: pgmap v843: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:33 compute-2 sudo[211393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbriebzyutuvxzgtkbdpkhjrgdxfgmil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398433.4244595-1422-161514401766004/AnsiballZ_stat.py'
Nov 29 06:40:33 compute-2 sudo[211393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:33.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:33 compute-2 sudo[211393]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:34.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:34 compute-2 sudo[211516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vivmoxblhdwmuskiygfroozkuwomnlvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398433.4244595-1422-161514401766004/AnsiballZ_copy.py'
Nov 29 06:40:34 compute-2 sudo[211516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:34 compute-2 sudo[211516]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:35 compute-2 ceph-mon[77142]: pgmap v844: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:35.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:36.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:36 compute-2 sudo[211596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:36 compute-2 sudo[211596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:36 compute-2 sudo[211596]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:36 compute-2 sudo[211621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:40:36 compute-2 sudo[211621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:36 compute-2 sudo[211621]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:37 compute-2 sudo[211720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxugculhmchqfxthattcdavvzbsadhal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398435.6624978-1474-204929096636031/AnsiballZ_container_config_data.py'
Nov 29 06:40:37 compute-2 sudo[211720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:37 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:40:37 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:40:37 compute-2 python3.9[211722]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 29 06:40:37 compute-2 sudo[211720]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:37.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:38.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:38 compute-2 ceph-mon[77142]: pgmap v845: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:38 compute-2 sudo[211872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjdefqanhbkwqlpbsgcatqafctxaguyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398437.7401102-1501-203288423552941/AnsiballZ_container_config_hash.py'
Nov 29 06:40:38 compute-2 sudo[211872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:38 compute-2 python3.9[211874]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:40:38 compute-2 sudo[211872]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:39 compute-2 ceph-mon[77142]: pgmap v846: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:39 compute-2 sudo[212025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkbbeirdwlykpenveuvozlqchwqvregc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398438.9501314-1528-81286697383267/AnsiballZ_podman_container_info.py'
Nov 29 06:40:39 compute-2 sudo[212025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:39 compute-2 python3.9[212027]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 06:40:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:39.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:39 compute-2 sudo[212025]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:40.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:41 compute-2 ceph-mon[77142]: pgmap v847: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:41 compute-2 sudo[212205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmvcqfhvwirxwlygjrtgsqjbdljoilwf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398441.132798-1566-222479547168790/AnsiballZ_edpm_container_manage.py'
Nov 29 06:40:41 compute-2 sudo[212205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:41.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:42 compute-2 python3[212207]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:40:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:42.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:43 compute-2 podman[212220]: 2025-11-29 06:40:43.243714791 +0000 UTC m=+1.154205205 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 06:40:43 compute-2 podman[212278]: 2025-11-29 06:40:43.366130039 +0000 UTC m=+0.043803887 container create d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:40:43 compute-2 podman[212278]: 2025-11-29 06:40:43.341922429 +0000 UTC m=+0.019596307 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 06:40:43 compute-2 python3[212207]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 06:40:43 compute-2 sudo[212205]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:43.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:44.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:44 compute-2 ceph-mon[77142]: pgmap v848: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:45 compute-2 ceph-mon[77142]: pgmap v849: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:45.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:46.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:46 compute-2 sudo[212468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mawqzybvlymxmcdllbyhifghhhmyvszm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398446.140557-1591-13198245969708/AnsiballZ_stat.py'
Nov 29 06:40:46 compute-2 sudo[212468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:46 compute-2 python3.9[212470]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:46 compute-2 sudo[212468]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:47 compute-2 sudo[212622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvcdmxncunvqpvpjqxxudqnmgpzkspjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398447.0423448-1617-34401716546264/AnsiballZ_file.py'
Nov 29 06:40:47 compute-2 sudo[212622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:47.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:47 compute-2 python3.9[212624]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:47 compute-2 sudo[212622]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:47 compute-2 ceph-mon[77142]: pgmap v850: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:48.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:48 compute-2 sudo[212698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykkikdslvsfcyutouedpbgpsgvmrvlbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398447.0423448-1617-34401716546264/AnsiballZ_stat.py'
Nov 29 06:40:48 compute-2 sudo[212698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:48 compute-2 python3.9[212700]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:48 compute-2 sudo[212698]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:48 compute-2 sudo[212850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spgksrnoztgyaestnffpqdalhexibdxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398448.3843925-1617-262441756147483/AnsiballZ_copy.py'
Nov 29 06:40:48 compute-2 sudo[212850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:49 compute-2 python3.9[212852]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398448.3843925-1617-262441756147483/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:49 compute-2 sudo[212850]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:49 compute-2 sudo[212926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkdpuifnqgyzfndmyjyghoyfoertgrci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398448.3843925-1617-262441756147483/AnsiballZ_systemd.py'
Nov 29 06:40:49 compute-2 sudo[212926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:49 compute-2 ceph-mon[77142]: pgmap v851: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:49 compute-2 python3.9[212928]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:40:49 compute-2 systemd[1]: Reloading.
Nov 29 06:40:49 compute-2 systemd-rc-local-generator[212950]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:40:49 compute-2 systemd-sysv-generator[212956]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:40:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:49.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:50 compute-2 sudo[212926]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:50.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:50 compute-2 sudo[213036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkvqrjmowgyvidxtirgqvdkjwjrwrjxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398448.3843925-1617-262441756147483/AnsiballZ_systemd.py'
Nov 29 06:40:50 compute-2 sudo[213036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:50 compute-2 sudo[213039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:50 compute-2 sudo[213039]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:50 compute-2 sudo[213039]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:50 compute-2 sudo[213064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:50 compute-2 sudo[213064]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:50 compute-2 sudo[213064]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:50 compute-2 python3.9[213038]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:51 compute-2 ceph-mon[77142]: pgmap v852: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:51 compute-2 systemd[1]: Reloading.
Nov 29 06:40:51 compute-2 systemd-sysv-generator[213144]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:40:51 compute-2 systemd-rc-local-generator[213140]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:40:51 compute-2 podman[213092]: 2025-11-29 06:40:51.762633116 +0000 UTC m=+0.162369952 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:40:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:51.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:51 compute-2 systemd[1]: Starting multipathd container...
Nov 29 06:40:52 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:40:52 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef36de37cfcbb612982f0f9bc5265315d5497549432bf65b48c9ef13850f00ea/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:40:52 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef36de37cfcbb612982f0f9bc5265315d5497549432bf65b48c9ef13850f00ea/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:40:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:52.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:52 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0.
Nov 29 06:40:52 compute-2 podman[213156]: 2025-11-29 06:40:52.103205934 +0000 UTC m=+0.113437408 container init d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 29 06:40:52 compute-2 multipathd[213172]: + sudo -E kolla_set_configs
Nov 29 06:40:52 compute-2 sudo[213188]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 29 06:40:52 compute-2 sudo[213188]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:40:52 compute-2 sudo[213188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:40:52 compute-2 podman[213156]: 2025-11-29 06:40:52.138159363 +0000 UTC m=+0.148390747 container start d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:40:52 compute-2 podman[213156]: multipathd
Nov 29 06:40:52 compute-2 systemd[1]: Started multipathd container.
Nov 29 06:40:52 compute-2 podman[213175]: 2025-11-29 06:40:52.15963911 +0000 UTC m=+0.064132634 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 06:40:52 compute-2 multipathd[213172]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:40:52 compute-2 multipathd[213172]: INFO:__main__:Validating config file
Nov 29 06:40:52 compute-2 multipathd[213172]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:40:52 compute-2 multipathd[213172]: INFO:__main__:Writing out command to execute
Nov 29 06:40:52 compute-2 sudo[213188]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:52 compute-2 sudo[213036]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:52 compute-2 multipathd[213172]: ++ cat /run_command
Nov 29 06:40:52 compute-2 multipathd[213172]: + CMD='/usr/sbin/multipathd -d'
Nov 29 06:40:52 compute-2 multipathd[213172]: + ARGS=
Nov 29 06:40:52 compute-2 multipathd[213172]: + sudo kolla_copy_cacerts
Nov 29 06:40:52 compute-2 sudo[213216]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 29 06:40:52 compute-2 sudo[213216]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:40:52 compute-2 sudo[213216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:40:52 compute-2 podman[213192]: 2025-11-29 06:40:52.20765212 +0000 UTC m=+0.058142813 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:40:52 compute-2 sudo[213216]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:52 compute-2 multipathd[213172]: + [[ ! -n '' ]]
Nov 29 06:40:52 compute-2 multipathd[213172]: + . kolla_extend_start
Nov 29 06:40:52 compute-2 multipathd[213172]: Running command: '/usr/sbin/multipathd -d'
Nov 29 06:40:52 compute-2 multipathd[213172]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 06:40:52 compute-2 multipathd[213172]: + umask 0022
Nov 29 06:40:52 compute-2 multipathd[213172]: + exec /usr/sbin/multipathd -d
Nov 29 06:40:52 compute-2 systemd[1]: d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-653d29d05bd51575.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:40:52 compute-2 systemd[1]: d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-653d29d05bd51575.service: Failed with result 'exit-code'.
Nov 29 06:40:52 compute-2 multipathd[213172]: 3920.880617 | --------start up--------
Nov 29 06:40:52 compute-2 multipathd[213172]: 3920.880638 | read /etc/multipath.conf
Nov 29 06:40:52 compute-2 multipathd[213172]: 3920.885709 | path checkers start up
Nov 29 06:40:52 compute-2 python3.9[213379]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:53 compute-2 sudo[213531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pimadfpvfwckyiegwornecnmgngrushp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398453.2760887-1726-43722196238200/AnsiballZ_command.py'
Nov 29 06:40:53 compute-2 sudo[213531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:53 compute-2 python3.9[213533]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:53 compute-2 sudo[213531]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:53.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:54.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:54 compute-2 ceph-mon[77142]: pgmap v853: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:54 compute-2 sudo[213697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phnbnoqpsbyuwouuwtwkwtqykfpueobr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398454.1744967-1749-242548239464767/AnsiballZ_systemd.py'
Nov 29 06:40:54 compute-2 sudo[213697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:54 compute-2 python3.9[213699]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:40:54 compute-2 systemd[1]: Stopping multipathd container...
Nov 29 06:40:54 compute-2 multipathd[213172]: 3923.480283 | exit (signal)
Nov 29 06:40:54 compute-2 multipathd[213172]: 3923.480976 | --------shut down-------
Nov 29 06:40:54 compute-2 systemd[1]: libpod-d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0.scope: Deactivated successfully.
Nov 29 06:40:54 compute-2 conmon[213172]: conmon d45765539066b12c036d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0.scope/container/memory.events
Nov 29 06:40:54 compute-2 podman[213703]: 2025-11-29 06:40:54.855345293 +0000 UTC m=+0.066770675 container died d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:40:54 compute-2 systemd[1]: d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-653d29d05bd51575.timer: Deactivated successfully.
Nov 29 06:40:54 compute-2 systemd[1]: Stopped /usr/bin/podman healthcheck run d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0.
Nov 29 06:40:54 compute-2 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:40:54 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-userdata-shm.mount: Deactivated successfully.
Nov 29 06:40:54 compute-2 systemd[1]: var-lib-containers-storage-overlay-ef36de37cfcbb612982f0f9bc5265315d5497549432bf65b48c9ef13850f00ea-merged.mount: Deactivated successfully.
Nov 29 06:40:55 compute-2 podman[213703]: 2025-11-29 06:40:55.304625461 +0000 UTC m=+0.516050883 container cleanup d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 29 06:40:55 compute-2 podman[213703]: multipathd
Nov 29 06:40:55 compute-2 podman[213731]: multipathd
Nov 29 06:40:55 compute-2 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 29 06:40:55 compute-2 systemd[1]: Stopped multipathd container.
Nov 29 06:40:55 compute-2 systemd[1]: Starting multipathd container...
Nov 29 06:40:55 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:40:55 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef36de37cfcbb612982f0f9bc5265315d5497549432bf65b48c9ef13850f00ea/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:40:55 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef36de37cfcbb612982f0f9bc5265315d5497549432bf65b48c9ef13850f00ea/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:40:55 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0.
Nov 29 06:40:55 compute-2 podman[213744]: 2025-11-29 06:40:55.504448918 +0000 UTC m=+0.098471956 container init d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:40:55 compute-2 multipathd[213759]: + sudo -E kolla_set_configs
Nov 29 06:40:55 compute-2 sudo[213765]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 29 06:40:55 compute-2 sudo[213765]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:40:55 compute-2 sudo[213765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:40:55 compute-2 podman[213744]: 2025-11-29 06:40:55.537300381 +0000 UTC m=+0.131323389 container start d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS)
Nov 29 06:40:55 compute-2 podman[213744]: multipathd
Nov 29 06:40:55 compute-2 systemd[1]: Started multipathd container.
Nov 29 06:40:55 compute-2 multipathd[213759]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:40:55 compute-2 multipathd[213759]: INFO:__main__:Validating config file
Nov 29 06:40:55 compute-2 multipathd[213759]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:40:55 compute-2 multipathd[213759]: INFO:__main__:Writing out command to execute
Nov 29 06:40:55 compute-2 sudo[213765]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:55 compute-2 sudo[213697]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:55 compute-2 multipathd[213759]: ++ cat /run_command
Nov 29 06:40:55 compute-2 multipathd[213759]: + CMD='/usr/sbin/multipathd -d'
Nov 29 06:40:55 compute-2 multipathd[213759]: + ARGS=
Nov 29 06:40:55 compute-2 multipathd[213759]: + sudo kolla_copy_cacerts
Nov 29 06:40:55 compute-2 sudo[213781]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 29 06:40:55 compute-2 sudo[213781]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:40:55 compute-2 sudo[213781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:40:55 compute-2 sudo[213781]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:55 compute-2 multipathd[213759]: + [[ ! -n '' ]]
Nov 29 06:40:55 compute-2 multipathd[213759]: + . kolla_extend_start
Nov 29 06:40:55 compute-2 multipathd[213759]: Running command: '/usr/sbin/multipathd -d'
Nov 29 06:40:55 compute-2 multipathd[213759]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 06:40:55 compute-2 multipathd[213759]: + umask 0022
Nov 29 06:40:55 compute-2 multipathd[213759]: + exec /usr/sbin/multipathd -d
Nov 29 06:40:55 compute-2 multipathd[213759]: 3924.311146 | --------start up--------
Nov 29 06:40:55 compute-2 multipathd[213759]: 3924.311194 | read /etc/multipath.conf
Nov 29 06:40:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:55 compute-2 podman[213766]: 2025-11-29 06:40:55.655789244 +0000 UTC m=+0.109426891 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 06:40:55 compute-2 multipathd[213759]: 3924.316137 | path checkers start up
Nov 29 06:40:55 compute-2 systemd[1]: d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-62a33ed563dbe54a.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:40:55 compute-2 systemd[1]: d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-62a33ed563dbe54a.service: Failed with result 'exit-code'.
Nov 29 06:40:55 compute-2 ceph-mon[77142]: pgmap v854: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:55.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:56.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:57.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:58.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:58 compute-2 ceph-mon[77142]: pgmap v855: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:58 compute-2 sudo[213949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swwigsxdnnpzipcrqabmxhxoqdopjusa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398458.197257-1774-46448003842337/AnsiballZ_file.py'
Nov 29 06:40:58 compute-2 sudo[213949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:58 compute-2 python3.9[213951]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:58 compute-2 sudo[213949]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:59 compute-2 sudo[214101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugjejdbkdzsqzopzlhaumfpfdoqrbaqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398459.4990795-1810-244700037647549/AnsiballZ_file.py'
Nov 29 06:40:59 compute-2 sudo[214101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:40:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:59.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:00 compute-2 python3.9[214103]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:41:00 compute-2 sudo[214101]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:00.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:00 compute-2 sudo[214254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svsiuckncitlqcyawrhcomufixpqmwve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398460.3455217-1833-260359527460656/AnsiballZ_modprobe.py'
Nov 29 06:41:00 compute-2 sudo[214254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:00 compute-2 ceph-mon[77142]: pgmap v856: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:00 compute-2 python3.9[214256]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 29 06:41:00 compute-2 kernel: Key type psk registered
Nov 29 06:41:00 compute-2 sudo[214254]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:01 compute-2 sudo[214417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwzzudcsofsbytiovppafgpfwxduyiut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398461.2045333-1857-149618396532093/AnsiballZ_stat.py'
Nov 29 06:41:01 compute-2 sudo[214417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:01 compute-2 python3.9[214419]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:41:01 compute-2 sudo[214417]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:01.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:02 compute-2 ceph-mon[77142]: pgmap v857: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:02.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:02 compute-2 sudo[214540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snznqvaxpczxygntuwvgfrsglrxnwlxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398461.2045333-1857-149618396532093/AnsiballZ_copy.py'
Nov 29 06:41:02 compute-2 sudo[214540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:02 compute-2 python3.9[214542]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398461.2045333-1857-149618396532093/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:02 compute-2 sudo[214540]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:03 compute-2 sudo[214693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvwozaglfjrlpjtuhzsrtplmciadyjyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398462.925315-1906-222368069886560/AnsiballZ_lineinfile.py'
Nov 29 06:41:03 compute-2 sudo[214693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:03 compute-2 python3.9[214695]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:03 compute-2 sudo[214693]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:03.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:04 compute-2 ceph-mon[77142]: pgmap v858: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 4.8 KiB/s rd, 0 B/s wr, 7 op/s
Nov 29 06:41:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:04.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:04 compute-2 sudo[214845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pojkttocmglboubaopwlombgdzdopqbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398463.7731934-1929-83023193666567/AnsiballZ_systemd.py'
Nov 29 06:41:04 compute-2 sudo[214845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:04 compute-2 python3.9[214847]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:41:04 compute-2 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 06:41:04 compute-2 systemd[1]: Stopped Load Kernel Modules.
Nov 29 06:41:04 compute-2 systemd[1]: Stopping Load Kernel Modules...
Nov 29 06:41:04 compute-2 systemd[1]: Starting Load Kernel Modules...
Nov 29 06:41:04 compute-2 systemd[1]: Finished Load Kernel Modules.
Nov 29 06:41:04 compute-2 sudo[214845]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:05 compute-2 sudo[215002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwggyfprdmvedyrnhfxdarlnpdipgiij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398464.8971765-1954-61622048224001/AnsiballZ_dnf.py'
Nov 29 06:41:05 compute-2 sudo[215002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:05 compute-2 python3.9[215004]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:41:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:05 compute-2 ceph-mon[77142]: pgmap v859: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 36 KiB/s rd, 0 B/s wr, 59 op/s
Nov 29 06:41:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:05.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:06.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:07 compute-2 ceph-mon[77142]: pgmap v860: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 80 KiB/s rd, 0 B/s wr, 132 op/s
Nov 29 06:41:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:07.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:08.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:09 compute-2 systemd[1]: Reloading.
Nov 29 06:41:09 compute-2 systemd-sysv-generator[215039]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:09 compute-2 systemd-rc-local-generator[215032]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:09 compute-2 systemd[1]: Reloading.
Nov 29 06:41:09 compute-2 systemd-sysv-generator[215074]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:09 compute-2 systemd-rc-local-generator[215068]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:09.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:10.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:10 compute-2 systemd-logind[784]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 06:41:10 compute-2 systemd-logind[784]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 06:41:10 compute-2 sudo[215129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:10 compute-2 sudo[215129]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:10 compute-2 sudo[215129]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:10 compute-2 lvm[215119]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:41:10 compute-2 lvm[215119]: VG ceph_vg0 finished
Nov 29 06:41:10 compute-2 sudo[215156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:10 compute-2 sudo[215156]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:10 compute-2 sudo[215156]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:10 compute-2 ceph-mon[77142]: pgmap v861: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 178 op/s
Nov 29 06:41:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:10 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:41:10 compute-2 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:41:10 compute-2 systemd[1]: Reloading.
Nov 29 06:41:10 compute-2 systemd-sysv-generator[215226]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:10 compute-2 systemd-rc-local-generator[215223]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:11 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:41:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:11.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:12.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:12 compute-2 ceph-mon[77142]: pgmap v862: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 178 op/s
Nov 29 06:41:13 compute-2 sudo[215002]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:13 compute-2 ceph-mon[77142]: pgmap v863: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 178 op/s
Nov 29 06:41:13 compute-2 sudo[216511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzrspjssoosfwpzroiymgkuajscoodza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398473.3480403-1978-269625844723975/AnsiballZ_systemd_service.py'
Nov 29 06:41:13 compute-2 sudo[216511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:13 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:41:13 compute-2 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:41:13 compute-2 systemd[1]: man-db-cache-update.service: Consumed 1.738s CPU time.
Nov 29 06:41:13 compute-2 systemd[1]: run-re34cc9af837248f7a074ad6d47e83fba.service: Deactivated successfully.
Nov 29 06:41:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:13.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:13 compute-2 python3.9[216513]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:41:13 compute-2 systemd[1]: Stopping Open-iSCSI...
Nov 29 06:41:13 compute-2 iscsid[204197]: iscsid shutting down.
Nov 29 06:41:13 compute-2 systemd[1]: iscsid.service: Deactivated successfully.
Nov 29 06:41:13 compute-2 systemd[1]: Stopped Open-iSCSI.
Nov 29 06:41:13 compute-2 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 06:41:13 compute-2 systemd[1]: Starting Open-iSCSI...
Nov 29 06:41:13 compute-2 systemd[1]: Started Open-iSCSI.
Nov 29 06:41:14 compute-2 sudo[216511]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:14.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:15 compute-2 python3.9[216669]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:41:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:41:15.132 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:41:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:41:15.133 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:41:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:41:15.133 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:41:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:15.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:15 compute-2 sudo[216823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nclncsanrewmhaehomnbcgktgcymxwwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398475.6327126-2030-184743925481569/AnsiballZ_file.py'
Nov 29 06:41:15 compute-2 sudo[216823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:16 compute-2 python3.9[216825]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:16.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:16 compute-2 sudo[216823]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:16 compute-2 ceph-mon[77142]: pgmap v864: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 102 KiB/s rd, 0 B/s wr, 170 op/s
Nov 29 06:41:17 compute-2 sudo[216976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twukvvijhtwbmwlsflsgdkpwyjmyuzfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398476.842099-2063-148261723642957/AnsiballZ_systemd_service.py'
Nov 29 06:41:17 compute-2 sudo[216976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:17 compute-2 python3.9[216978]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:41:17 compute-2 systemd[1]: Reloading.
Nov 29 06:41:17 compute-2 systemd-sysv-generator[217008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:17 compute-2 systemd-rc-local-generator[217003]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:17 compute-2 ceph-mon[77142]: pgmap v865: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 71 KiB/s rd, 0 B/s wr, 119 op/s
Nov 29 06:41:17 compute-2 sudo[216976]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:17.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:18.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:18 compute-2 python3.9[217162]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:41:18 compute-2 network[217180]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:41:18 compute-2 network[217181]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:41:18 compute-2 network[217182]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:41:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:19.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:20.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:20 compute-2 ceph-mon[77142]: pgmap v866: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 27 KiB/s rd, 0 B/s wr, 45 op/s
Nov 29 06:41:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:41:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:21.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:41:22 compute-2 ceph-mon[77142]: pgmap v867: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:22 compute-2 podman[217269]: 2025-11-29 06:41:22.093336095 +0000 UTC m=+0.094475376 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 06:41:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:22.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:22 compute-2 podman[217306]: 2025-11-29 06:41:22.254966832 +0000 UTC m=+0.048799851 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 06:41:23 compute-2 sudo[217502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egdyxzvtwuoaxnhxqotlresoubcvgwcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398483.1783583-2120-224344773079236/AnsiballZ_systemd_service.py'
Nov 29 06:41:23 compute-2 sudo[217502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:23 compute-2 python3.9[217504]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:23 compute-2 sudo[217502]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:23.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:24 compute-2 ceph-mon[77142]: pgmap v868: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:24.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:24 compute-2 sudo[217655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yujeoktffanwmnsjqxiuoxyhmwfrabqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398483.9074957-2120-116661698227250/AnsiballZ_systemd_service.py'
Nov 29 06:41:24 compute-2 sudo[217655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:24 compute-2 python3.9[217657]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:24 compute-2 sudo[217655]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:24 compute-2 sudo[217809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsqofpjyduioxuxigotljtafcsfazymq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398484.6191795-2120-56927812257961/AnsiballZ_systemd_service.py'
Nov 29 06:41:24 compute-2 sudo[217809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:25 compute-2 python3.9[217811]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:25 compute-2 sudo[217809]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:25 compute-2 sudo[217962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psaszkiawimolvaskhtlilbdzzysbuoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398485.3800392-2120-106936920147913/AnsiballZ_systemd_service.py'
Nov 29 06:41:25 compute-2 sudo[217962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:25 compute-2 ceph-mon[77142]: pgmap v869: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:25 compute-2 podman[217965]: 2025-11-29 06:41:25.889760813 +0000 UTC m=+0.059776265 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:41:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:25.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:25 compute-2 python3.9[217964]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:25 compute-2 sudo[217962]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:26.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:26 compute-2 sudo[218136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyryetgtvonikuknysjhbuxmnyfckpzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398486.1367393-2120-112194466805880/AnsiballZ_systemd_service.py'
Nov 29 06:41:26 compute-2 sudo[218136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:26 compute-2 python3.9[218139]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:26 compute-2 sudo[218136]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:27 compute-2 sudo[218290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzxrttkzhqwgskpepqwbbpunrpjxcgjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398486.8667595-2120-15978266182543/AnsiballZ_systemd_service.py'
Nov 29 06:41:27 compute-2 sudo[218290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:27 compute-2 python3.9[218292]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:27 compute-2 sudo[218290]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:27 compute-2 sudo[218443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cscgapwyrhfnlufomypwzimoaczuyfzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398487.5845525-2120-55661286929639/AnsiballZ_systemd_service.py'
Nov 29 06:41:27 compute-2 sudo[218443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:27.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:28 compute-2 python3.9[218445]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:28.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:28 compute-2 sudo[218443]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:28 compute-2 ceph-mon[77142]: pgmap v870: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:28 compute-2 sudo[218597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxrabivcskaxubovfyelelrlbcklxzyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398488.2699776-2120-16792965391472/AnsiballZ_systemd_service.py'
Nov 29 06:41:28 compute-2 sudo[218597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:28 compute-2 python3.9[218599]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:28 compute-2 sudo[218597]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:29.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:30.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:30 compute-2 sudo[218751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyukpxtajmhpqgqjvnmcvxjznllwabva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398490.3126957-2297-82662763033187/AnsiballZ_file.py'
Nov 29 06:41:30 compute-2 sudo[218751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:30 compute-2 sudo[218754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:30 compute-2 sudo[218754]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:30 compute-2 sudo[218754]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:30 compute-2 sudo[218779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:30 compute-2 sudo[218779]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:30 compute-2 sudo[218779]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:30 compute-2 python3.9[218753]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:30 compute-2 sudo[218751]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:30 compute-2 ceph-mon[77142]: pgmap v871: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:31 compute-2 sudo[218953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grsjclaujrjojvsqohlwpdygcvyrbysd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398490.9221835-2297-114484171340950/AnsiballZ_file.py'
Nov 29 06:41:31 compute-2 sudo[218953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:31 compute-2 python3.9[218955]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:31 compute-2 sudo[218953]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:31 compute-2 sudo[219105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zobggqdhbcmtomwibhpassxrrfsuuntf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398491.486463-2297-3338167774779/AnsiballZ_file.py'
Nov 29 06:41:31 compute-2 sudo[219105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:31.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:32 compute-2 python3.9[219107]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:32 compute-2 sudo[219105]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:32.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:32 compute-2 ceph-mon[77142]: pgmap v872: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:32 compute-2 sudo[219258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpudywksnjazpagnaunzrpsukkxbmxmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398492.1567783-2297-7978589322040/AnsiballZ_file.py'
Nov 29 06:41:32 compute-2 sudo[219258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:32 compute-2 python3.9[219260]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:32 compute-2 sudo[219258]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:33 compute-2 sudo[219410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goygtlhcriifsgoezcxyjfxlggdapqvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398492.769775-2297-111470531487609/AnsiballZ_file.py'
Nov 29 06:41:33 compute-2 sudo[219410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:33 compute-2 python3.9[219412]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:33 compute-2 sudo[219410]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:33 compute-2 ceph-mon[77142]: pgmap v873: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:33 compute-2 sudo[219562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzkgpefbujpsxkhzaqzfsizclclzqvlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398493.3444767-2297-264234550131929/AnsiballZ_file.py'
Nov 29 06:41:33 compute-2 sudo[219562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:33 compute-2 python3.9[219564]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:33 compute-2 sudo[219562]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:41:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:33.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:41:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:34.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:34 compute-2 sudo[219714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxpakvtojqksvvpfwnkmlkeijuafdhlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398494.0082076-2297-37781899137382/AnsiballZ_file.py'
Nov 29 06:41:34 compute-2 sudo[219714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:34 compute-2 python3.9[219716]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:34 compute-2 sudo[219714]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:35 compute-2 sudo[219867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsujxlgaukwhilvsdsxuoitswkrmqflc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398494.6585069-2297-217761400433563/AnsiballZ_file.py'
Nov 29 06:41:35 compute-2 sudo[219867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:35 compute-2 python3.9[219869]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:35 compute-2 sudo[219867]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:35 compute-2 ceph-mon[77142]: pgmap v874: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:35.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:36.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:36 compute-2 sudo[219894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:36 compute-2 sudo[219894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:36 compute-2 sudo[219894]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:36 compute-2 sudo[219919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:41:36 compute-2 sudo[219919]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:36 compute-2 sudo[219919]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:36 compute-2 sudo[219945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:36 compute-2 sudo[219945]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:36 compute-2 sudo[219945]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:36 compute-2 sudo[219970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:41:36 compute-2 sudo[219970]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:37 compute-2 podman[220143]: 2025-11-29 06:41:37.10774482 +0000 UTC m=+0.059751704 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef)
Nov 29 06:41:37 compute-2 sudo[220213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dljesrgljvogjjxrebiyzpndqzhowjgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398496.911527-2468-29761635874560/AnsiballZ_file.py'
Nov 29 06:41:37 compute-2 sudo[220213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:37 compute-2 podman[220143]: 2025-11-29 06:41:37.220358652 +0000 UTC m=+0.172365546 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:41:37 compute-2 python3.9[220215]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:37 compute-2 sudo[220213]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:37 compute-2 sudo[220502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkzdoacquxjkpdoeooujlfcgigksstmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398497.5259507-2468-83947892830160/AnsiballZ_file.py'
Nov 29 06:41:37 compute-2 sudo[220502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:37 compute-2 podman[220500]: 2025-11-29 06:41:37.816370754 +0000 UTC m=+0.058091019 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:41:37 compute-2 podman[220500]: 2025-11-29 06:41:37.826350432 +0000 UTC m=+0.068070707 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:41:37 compute-2 ceph-mon[77142]: pgmap v875: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:37 compute-2 python3.9[220513]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:37.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:37 compute-2 sudo[220502]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:38 compute-2 podman[220566]: 2025-11-29 06:41:38.018485358 +0000 UTC m=+0.042828081 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, release=1793, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, name=keepalived, description=keepalived for Ceph, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.buildah.version=1.28.2, architecture=x86_64)
Nov 29 06:41:38 compute-2 podman[220566]: 2025-11-29 06:41:38.032340219 +0000 UTC m=+0.056682942 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, description=keepalived for Ceph, io.buildah.version=1.28.2, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, release=1793, vcs-type=git, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public)
Nov 29 06:41:38 compute-2 sudo[219970]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:38.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:38 compute-2 sudo[220679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:38 compute-2 sudo[220679]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:38 compute-2 sudo[220679]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:38 compute-2 sudo[220722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:41:38 compute-2 sudo[220722]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:38 compute-2 sudo[220722]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:38 compute-2 sudo[220774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:38 compute-2 sudo[220774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:38 compute-2 sudo[220820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkjzhcawambjwrooyefvavjqvdhcfumz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398498.0906174-2468-204433609769985/AnsiballZ_file.py'
Nov 29 06:41:38 compute-2 sudo[220774]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:38 compute-2 sudo[220820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:38 compute-2 sudo[220825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:41:38 compute-2 sudo[220825]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:38 compute-2 python3.9[220824]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:38 compute-2 sudo[220820]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:38 compute-2 sudo[220825]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:38 compute-2 sudo[221030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbtaxkuostwiydeedrubipcdmuuwuzey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398498.7355368-2468-204448936205589/AnsiballZ_file.py'
Nov 29 06:41:38 compute-2 sudo[221030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:39 compute-2 python3.9[221032]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:39 compute-2 sudo[221030]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:41:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:41:39 compute-2 ceph-mon[77142]: pgmap v876: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 06:41:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:41:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:41:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:41:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:41:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:41:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:41:39 compute-2 sudo[221182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szcrcdzjqbcybxvymodacydjeopngulv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398499.3146288-2468-104784567942169/AnsiballZ_file.py'
Nov 29 06:41:39 compute-2 sudo[221182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:39 compute-2 python3.9[221184]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:39 compute-2 sudo[221182]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:39.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:40.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:40 compute-2 sudo[221336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcezdnpinfognbhkmxetjbobjdcregbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398499.963215-2468-82276600587271/AnsiballZ_file.py'
Nov 29 06:41:40 compute-2 sudo[221336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:40 compute-2 python3.9[221338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:40 compute-2 sudo[221336]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:40 compute-2 sshd-session[221284]: Invalid user ubuntu from 92.118.39.92 port 34058
Nov 29 06:41:40 compute-2 sshd-session[221284]: Connection closed by invalid user ubuntu 92.118.39.92 port 34058 [preauth]
Nov 29 06:41:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:40 compute-2 sudo[221489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyyaykwcymutzfhfalsrjrkxcaipgffb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398500.6246371-2468-23489674491596/AnsiballZ_file.py'
Nov 29 06:41:40 compute-2 sudo[221489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:41 compute-2 python3.9[221491]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:41 compute-2 sudo[221489]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:41 compute-2 ceph-mon[77142]: pgmap v877: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:41 compute-2 sudo[221641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axkqzfdlxqfdbknogckeeedyixlafsjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398501.2228584-2468-271862028927328/AnsiballZ_file.py'
Nov 29 06:41:41 compute-2 sudo[221641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:41 compute-2 python3.9[221643]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:41 compute-2 sudo[221641]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:41.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:42.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:43 compute-2 sudo[221794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxanajhcylredbvuowuvdtejkakcnzqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398503.0959632-2643-242727298686423/AnsiballZ_command.py'
Nov 29 06:41:43 compute-2 sudo[221794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:43 compute-2 python3.9[221796]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:43 compute-2 sudo[221794]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:43 compute-2 ceph-mon[77142]: pgmap v878: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:43.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:44.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:44 compute-2 python3.9[221948]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:41:45 compute-2 sudo[222099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtpsityorahuxkgusdcoiubmquycxifi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398504.9379737-2696-194445006176840/AnsiballZ_systemd_service.py'
Nov 29 06:41:45 compute-2 sudo[222099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:45 compute-2 python3.9[222101]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:41:45 compute-2 systemd[1]: Reloading.
Nov 29 06:41:45 compute-2 systemd-sysv-generator[222127]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:45 compute-2 systemd-rc-local-generator[222123]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:45 compute-2 ceph-mon[77142]: pgmap v879: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:45 compute-2 sudo[222099]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:45.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:46.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:46 compute-2 sudo[222286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpapbaqownmrjalmbnywhplzlenohkxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398506.2497313-2720-241323119293237/AnsiballZ_command.py'
Nov 29 06:41:46 compute-2 sudo[222286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:46 compute-2 python3.9[222288]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:46 compute-2 sudo[222286]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:46 compute-2 sudo[222303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:46 compute-2 sudo[222303]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:46 compute-2 sudo[222303]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:46 compute-2 sudo[222347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:41:46 compute-2 sudo[222347]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:46 compute-2 sudo[222347]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:47 compute-2 sudo[222489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmndoywakxjrhefxrunvepxycfvqtwhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398506.86885-2720-274470635349002/AnsiballZ_command.py'
Nov 29 06:41:47 compute-2 sudo[222489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:47 compute-2 python3.9[222491]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:47 compute-2 sudo[222489]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:47 compute-2 sudo[222642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttxxpoyziftnyhbhlebuehiycwsnqsug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398507.4240897-2720-162882201174391/AnsiballZ_command.py'
Nov 29 06:41:47 compute-2 sudo[222642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:41:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:41:47 compute-2 ceph-mon[77142]: pgmap v880: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:47 compute-2 python3.9[222644]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:47 compute-2 sudo[222642]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:47.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:48.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:48 compute-2 sudo[222795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moppvozqktkzdqrjvyjalbwscdgahbng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398508.0186021-2720-139551993706671/AnsiballZ_command.py'
Nov 29 06:41:48 compute-2 sudo[222795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:48 compute-2 python3.9[222797]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:48 compute-2 sudo[222795]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:49 compute-2 sudo[222949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdjlphkeyvpjdosujfynbepajnexwsvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398508.951937-2720-112659840761975/AnsiballZ_command.py'
Nov 29 06:41:49 compute-2 sudo[222949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:49 compute-2 python3.9[222951]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:49 compute-2 sudo[222949]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:49 compute-2 sudo[223102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xutdnphnrnwbpuzlgncorkeqqryxnaug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398509.581984-2720-259122775001483/AnsiballZ_command.py'
Nov 29 06:41:49 compute-2 sudo[223102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:49.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:50 compute-2 python3.9[223104]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:50 compute-2 sudo[223102]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:50 compute-2 ceph-mon[77142]: pgmap v881: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:50.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:50 compute-2 sudo[223255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgktxiumiggvbqrdyeasvddmdvwnpibf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398510.186703-2720-174860168366680/AnsiballZ_command.py'
Nov 29 06:41:50 compute-2 sudo[223255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:50 compute-2 python3.9[223257]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:50 compute-2 sudo[223255]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:50 compute-2 sudo[223344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:50 compute-2 sudo[223344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:50 compute-2 sudo[223344]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:50 compute-2 sudo[223387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:50 compute-2 sudo[223387]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:50 compute-2 sudo[223387]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:50 compute-2 sudo[223459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgtubfzhmyddobkkntuirdxyiocaoksf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398510.7196298-2720-251931908380389/AnsiballZ_command.py'
Nov 29 06:41:51 compute-2 sudo[223459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:51 compute-2 python3.9[223461]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:51 compute-2 sudo[223459]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:51 compute-2 ceph-mon[77142]: pgmap v882: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:51.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:52.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:52 compute-2 podman[223489]: 2025-11-29 06:41:52.911207566 +0000 UTC m=+0.075625920 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 06:41:52 compute-2 podman[223488]: 2025-11-29 06:41:52.96464473 +0000 UTC m=+0.128932651 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 06:41:53 compute-2 ceph-mon[77142]: pgmap v883: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:41:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:53.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:41:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:54.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:54 compute-2 sudo[223656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrujpsbvxiqsljlduhniihvujlvzdfoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398514.1810594-2927-17222931243208/AnsiballZ_file.py'
Nov 29 06:41:54 compute-2 sudo[223656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:54 compute-2 python3.9[223659]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:54 compute-2 sudo[223656]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:54 compute-2 sudo[223809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egowuezapqpnydznbgfwkeiqykkvuaie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398514.7624722-2927-101446227241437/AnsiballZ_file.py'
Nov 29 06:41:54 compute-2 sudo[223809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:55 compute-2 python3.9[223811]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:55 compute-2 sudo[223809]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:55 compute-2 sudo[223961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hudmukhafkuhnxxsgmsuqhjxalwuybmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398515.3037748-2927-36576125431581/AnsiballZ_file.py'
Nov 29 06:41:55 compute-2 sudo[223961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:55 compute-2 python3.9[223963]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:55 compute-2 sudo[223961]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:55.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:56.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:56 compute-2 ceph-mon[77142]: pgmap v884: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:56 compute-2 sudo[224127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwwpksjlqdpyzkclqqcwacfyhpuvekpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398516.4597745-2993-269533106417379/AnsiballZ_file.py'
Nov 29 06:41:56 compute-2 sudo[224127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:56 compute-2 podman[224088]: 2025-11-29 06:41:56.74361703 +0000 UTC m=+0.056373673 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 06:41:56 compute-2 python3.9[224136]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:56 compute-2 sudo[224127]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:57 compute-2 sudo[224286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqzowxqaqyergllzjxpmqkjwvpqsqsht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398517.0553918-2993-60169960413525/AnsiballZ_file.py'
Nov 29 06:41:57 compute-2 sudo[224286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:57 compute-2 python3.9[224288]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:57 compute-2 sudo[224286]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:57 compute-2 ceph-mon[77142]: pgmap v885: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:57 compute-2 sudo[224438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiafjzwwbmjqmovmnlforrhhgekopkcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398517.7170484-2993-210186364435778/AnsiballZ_file.py'
Nov 29 06:41:57 compute-2 sudo[224438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:57.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:58 compute-2 python3.9[224440]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:58 compute-2 sudo[224438]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:58.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:58 compute-2 sudo[224591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhagbkfwjkobioulueixvgbebbhrbxpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398518.3141758-2993-100122161311608/AnsiballZ_file.py'
Nov 29 06:41:58 compute-2 sudo[224591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:58 compute-2 python3.9[224593]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:58 compute-2 sudo[224591]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:59 compute-2 sudo[224743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmdrckxtxbotkgdeukawutcftgusvagi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398518.970359-2993-84587188818018/AnsiballZ_file.py'
Nov 29 06:41:59 compute-2 sudo[224743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:59 compute-2 python3.9[224745]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:59 compute-2 sudo[224743]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:59 compute-2 sudo[224895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lygstajeulvlmgpbtqiasqknjouinsvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398519.5432682-2993-269533608842645/AnsiballZ_file.py'
Nov 29 06:41:59 compute-2 sudo[224895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:59 compute-2 ceph-mon[77142]: pgmap v886: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:41:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:59.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:00 compute-2 python3.9[224897]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:00 compute-2 sudo[224895]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:00.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:00 compute-2 sudo[225048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkzwnqifmnefcvtkiorqoiwkbrbnivwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398520.164841-2993-125872918976444/AnsiballZ_file.py'
Nov 29 06:42:00 compute-2 sudo[225048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:00 compute-2 python3.9[225050]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:00 compute-2 sudo[225048]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:01 compute-2 ceph-mon[77142]: pgmap v887: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:01.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:02.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:03 compute-2 ceph-mon[77142]: pgmap v888: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:03.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:04.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:05 compute-2 ceph-mon[77142]: pgmap v889: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:05.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:06.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:07 compute-2 ceph-mon[77142]: pgmap v890: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:42:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:08.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:42:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:42:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:08.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:42:09 compute-2 ceph-mon[77142]: pgmap v891: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:42:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:10.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:42:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:10.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:11 compute-2 sudo[225080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:42:11 compute-2 sudo[225080]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:11 compute-2 sudo[225080]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:11 compute-2 sudo[225105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:42:11 compute-2 sudo[225105]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:11 compute-2 sudo[225105]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:11 compute-2 ceph-mon[77142]: pgmap v892: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:42:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:12.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:42:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:42:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:12.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:42:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:14.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:14 compute-2 ceph-mon[77142]: pgmap v893: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:42:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:14.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:42:14 compute-2 sudo[225257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tweidbrtemxnetpkzamvftxhfaogxxhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398534.3017983-3318-245094955897755/AnsiballZ_getent.py'
Nov 29 06:42:14 compute-2 sudo[225257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:15 compute-2 python3.9[225259]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 29 06:42:15 compute-2 sudo[225257]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:42:15.133 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:42:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:42:15.133 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:42:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:42:15.133 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:42:15 compute-2 sudo[225410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iywuqxfksmkhpvfejdoqxhfqwpuayyeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398535.2514403-3342-180391546538455/AnsiballZ_group.py'
Nov 29 06:42:15 compute-2 sudo[225410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:15 compute-2 python3.9[225412]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:42:15 compute-2 groupadd[225413]: group added to /etc/group: name=nova, GID=42436
Nov 29 06:42:15 compute-2 groupadd[225413]: group added to /etc/gshadow: name=nova
Nov 29 06:42:15 compute-2 groupadd[225413]: new group: name=nova, GID=42436
Nov 29 06:42:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:15 compute-2 sudo[225410]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:42:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:16.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:42:16 compute-2 ceph-mon[77142]: pgmap v894: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:16.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:16 compute-2 sudo[225569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijfrkktvpfmtsdydqhurfwrgncscweme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398536.2287652-3366-136555432736627/AnsiballZ_user.py'
Nov 29 06:42:16 compute-2 sudo[225569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:16 compute-2 python3.9[225571]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:42:16 compute-2 useradd[225573]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 29 06:42:16 compute-2 useradd[225573]: add 'nova' to group 'libvirt'
Nov 29 06:42:16 compute-2 useradd[225573]: add 'nova' to shadow group 'libvirt'
Nov 29 06:42:17 compute-2 sudo[225569]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:17 compute-2 ceph-mon[77142]: pgmap v895: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:18.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:42:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:18.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:42:18 compute-2 sshd-session[225605]: Accepted publickey for zuul from 192.168.122.30 port 57934 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:42:18 compute-2 systemd-logind[784]: New session 51 of user zuul.
Nov 29 06:42:18 compute-2 systemd[1]: Started Session 51 of User zuul.
Nov 29 06:42:18 compute-2 sshd-session[225605]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:42:18 compute-2 sshd-session[225608]: Received disconnect from 192.168.122.30 port 57934:11: disconnected by user
Nov 29 06:42:18 compute-2 sshd-session[225608]: Disconnected from user zuul 192.168.122.30 port 57934
Nov 29 06:42:18 compute-2 sshd-session[225605]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:42:18 compute-2 systemd[1]: session-51.scope: Deactivated successfully.
Nov 29 06:42:18 compute-2 systemd-logind[784]: Session 51 logged out. Waiting for processes to exit.
Nov 29 06:42:18 compute-2 systemd-logind[784]: Removed session 51.
Nov 29 06:42:19 compute-2 python3.9[225758]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:19 compute-2 ceph-mon[77142]: pgmap v896: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:20.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:20 compute-2 python3.9[225879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398539.0540826-3441-30724127400186/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:20.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:20 compute-2 python3.9[226030]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:21 compute-2 python3.9[226106]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:21 compute-2 ceph-mon[77142]: pgmap v897: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:21 compute-2 python3.9[226256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:42:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:22.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:42:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:42:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:22.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:42:22 compute-2 python3.9[226377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398541.3067589-3441-233299418078229/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:22 compute-2 python3.9[226528]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:23 compute-2 podman[226624]: 2025-11-29 06:42:23.300745564 +0000 UTC m=+0.053415256 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:42:23 compute-2 podman[226623]: 2025-11-29 06:42:23.330241898 +0000 UTC m=+0.082936670 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 06:42:23 compute-2 python3.9[226678]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398542.3932009-3441-151088684425609/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:23 compute-2 ceph-mon[77142]: pgmap v898: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:24 compute-2 python3.9[226843]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:24.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:24.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:24 compute-2 python3.9[226964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398543.5901532-3441-170291300440708/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:25 compute-2 python3.9[227115]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:25 compute-2 python3.9[227236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398544.664049-3441-201786990003353/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:26.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:26 compute-2 ceph-mon[77142]: pgmap v899: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:26.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:26 compute-2 podman[227262]: 2025-11-29 06:42:26.88489776 +0000 UTC m=+0.052026561 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Nov 29 06:42:27 compute-2 sudo[227407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzfrstezujottpuleosdymrslqtnlmuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398547.0830379-3690-145321336621396/AnsiballZ_file.py'
Nov 29 06:42:27 compute-2 sudo[227407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:27 compute-2 python3.9[227409]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:27 compute-2 sudo[227407]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:28.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:28 compute-2 ceph-mon[77142]: pgmap v900: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:28.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:28 compute-2 sudo[227559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuelapnbtnusplaalqsnizsejfeeipux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398547.9136431-3715-263013021168545/AnsiballZ_copy.py'
Nov 29 06:42:28 compute-2 sudo[227559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:28 compute-2 python3.9[227561]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:28 compute-2 sudo[227559]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:29 compute-2 sudo[227712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqiypwsuyqrkwfwjymcasvnbbobjygwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398548.6724496-3738-162172807288783/AnsiballZ_stat.py'
Nov 29 06:42:29 compute-2 sudo[227712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:29 compute-2 python3.9[227714]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:29 compute-2 sudo[227712]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:29 compute-2 ceph-mon[77142]: pgmap v901: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:29 compute-2 sudo[227864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnnyxxzpzmgbcyppjycnrlsleijythcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398549.515259-3764-30597739619614/AnsiballZ_stat.py'
Nov 29 06:42:29 compute-2 sudo[227864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:30 compute-2 python3.9[227866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:30 compute-2 sudo[227864]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:30.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:30.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:30 compute-2 sudo[227987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usnrgaozuhhfhetdltcvliogpolvnejh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398549.515259-3764-30597739619614/AnsiballZ_copy.py'
Nov 29 06:42:30 compute-2 sudo[227987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:30 compute-2 python3.9[227989]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764398549.515259-3764-30597739619614/.source _original_basename=.wx6hmnjk follow=False checksum=80d66b7884a0d69a26deb9106b95d887b7961548 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 29 06:42:30 compute-2 sudo[227987]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:31 compute-2 sudo[228040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:42:31 compute-2 sudo[228040]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:31 compute-2 sudo[228040]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:31 compute-2 sudo[228094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:42:31 compute-2 sudo[228094]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:31 compute-2 sudo[228094]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:31 compute-2 python3.9[228192]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:32.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:32.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:32 compute-2 ceph-mon[77142]: pgmap v902: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:32 compute-2 python3.9[228344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:32 compute-2 python3.9[228466]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398551.8989372-3840-165265584278883/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:33 compute-2 ceph-mon[77142]: pgmap v903: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:33 compute-2 python3.9[228616]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:34.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:34 compute-2 python3.9[228737]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398553.2663946-3885-278237524724234/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:42:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:34.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:42:35 compute-2 sudo[228888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqmtwhcwdrsmhwfplmbpsdtsjkwpxtij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398554.9501796-3938-22169128579436/AnsiballZ_container_config_data.py'
Nov 29 06:42:35 compute-2 sudo[228888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:35 compute-2 python3.9[228890]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 29 06:42:35 compute-2 sudo[228888]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:35 compute-2 ceph-mon[77142]: pgmap v904: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:42:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:36.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:42:36 compute-2 sudo[229040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nodueaecfknaiqgwbcwsqoguwxjquesg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398555.756479-3963-105019185395935/AnsiballZ_container_config_hash.py'
Nov 29 06:42:36 compute-2 sudo[229040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:36.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:36 compute-2 python3.9[229042]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:42:36 compute-2 sudo[229040]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:37 compute-2 sudo[229193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umwewncskwdyebpmtunobjqboogblvbg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398556.8423364-3992-25104909391246/AnsiballZ_edpm_container_manage.py'
Nov 29 06:42:37 compute-2 sudo[229193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:37 compute-2 python3[229195]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:42:37 compute-2 ceph-mon[77142]: pgmap v905: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:38.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:38.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:40 compute-2 ceph-mon[77142]: pgmap v906: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:40.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:40.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:41 compute-2 ceph-mon[77142]: pgmap v907: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:42.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:42.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:44.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:44.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:46.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:46.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:46 compute-2 sudo[229268]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:42:46 compute-2 sudo[229268]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:46 compute-2 sudo[229268]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:47 compute-2 sudo[229293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:42:47 compute-2 sudo[229293]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:47 compute-2 sudo[229293]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:47 compute-2 sudo[229318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:42:47 compute-2 sudo[229318]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:47 compute-2 sudo[229318]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:47 compute-2 sudo[229343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:42:47 compute-2 sudo[229343]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:42:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:48.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:42:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:48.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:48 compute-2 ceph-mon[77142]: pgmap v908: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:49 compute-2 sudo[229343]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:49 compute-2 podman[229208]: 2025-11-29 06:42:49.743085918 +0000 UTC m=+12.260450199 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 06:42:49 compute-2 podman[229422]: 2025-11-29 06:42:49.858985809 +0000 UTC m=+0.026630821 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 06:42:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:50.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:50 compute-2 ceph-mon[77142]: pgmap v909: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:50 compute-2 ceph-mon[77142]: pgmap v910: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:42:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:50.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:42:50 compute-2 podman[229422]: 2025-11-29 06:42:50.466998886 +0000 UTC m=+0.634643848 container create d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, container_name=nova_compute_init, tcib_managed=true)
Nov 29 06:42:50 compute-2 python3[229195]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 29 06:42:50 compute-2 sudo[229193]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:51 compute-2 sudo[229611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjseppjjnqwbynqevhjxjnnmmleocnsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398570.802072-4017-119864086566200/AnsiballZ_stat.py'
Nov 29 06:42:51 compute-2 sudo[229611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:51 compute-2 python3.9[229613]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:51 compute-2 sudo[229611]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:51 compute-2 sudo[229616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:42:51 compute-2 sudo[229616]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:51 compute-2 sudo[229616]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:51 compute-2 sudo[229653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:42:51 compute-2 sudo[229653]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:51 compute-2 sudo[229653]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:52.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:52.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:52 compute-2 sudo[229816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcateykigsucequueyugjzyskhbujaze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398572.230131-4053-25717380528661/AnsiballZ_container_config_data.py'
Nov 29 06:42:52 compute-2 sudo[229816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:52 compute-2 python3.9[229818]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 29 06:42:52 compute-2 sudo[229816]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:53 compute-2 sudo[229993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsbwaeyvkzuzohlcaotdfbfmsxmicxds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398573.2984014-4080-259199259116201/AnsiballZ_container_config_hash.py'
Nov 29 06:42:53 compute-2 sudo[229993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:53 compute-2 podman[229943]: 2025-11-29 06:42:53.858060189 +0000 UTC m=+0.053782585 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:42:53 compute-2 podman[229942]: 2025-11-29 06:42:53.884204437 +0000 UTC m=+0.082558370 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:42:54 compute-2 python3.9[230007]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:42:54 compute-2 sudo[229993]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:54.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:42:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:54.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:42:54 compute-2 ceph-mon[77142]: pgmap v911: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:42:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:42:54 compute-2 ceph-mon[77142]: pgmap v912: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:42:54 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:42:54 compute-2 sudo[230166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwmhirjwyahcxrgbeiglhwwfnclcvgsh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398574.6268568-4110-138432980453689/AnsiballZ_edpm_container_manage.py'
Nov 29 06:42:54 compute-2 sudo[230166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:55 compute-2 python3[230168]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:42:55 compute-2 podman[230202]: 2025-11-29 06:42:55.342889601 +0000 UTC m=+0.056968587 container create e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 06:42:55 compute-2 podman[230202]: 2025-11-29 06:42:55.308537593 +0000 UTC m=+0.022616579 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 06:42:55 compute-2 python3[230168]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 29 06:42:55 compute-2 sudo[230166]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:42:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:56.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:42:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:56.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:56 compute-2 ceph-mon[77142]: pgmap v913: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:56 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:42:56 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:42:56 compute-2 ceph-mon[77142]: pgmap v914: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:57 compute-2 podman[230365]: 2025-11-29 06:42:57.211762765 +0000 UTC m=+0.061515863 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:42:57 compute-2 sudo[230410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmzzbpembdakudobnqncyjsezxrqrkew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398576.8620105-4134-264223276835285/AnsiballZ_stat.py'
Nov 29 06:42:57 compute-2 sudo[230410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:57 compute-2 python3.9[230413]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:57 compute-2 sudo[230410]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:57 compute-2 ceph-mon[77142]: pgmap v915: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:58 compute-2 sudo[230565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvvsfmszwhdsvpztuqecufabdqrcyiqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398577.8005857-4160-55270182673473/AnsiballZ_file.py'
Nov 29 06:42:58 compute-2 sudo[230565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:58.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:58 compute-2 python3.9[230567]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:42:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:58.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:58 compute-2 sudo[230565]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:58 compute-2 sudo[230717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxbukhnqujltvksydjhhyqijohmcfoqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398578.3112159-4160-270685850365797/AnsiballZ_copy.py'
Nov 29 06:42:58 compute-2 sudo[230717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:58 compute-2 python3.9[230719]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398578.3112159-4160-270685850365797/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:58 compute-2 sudo[230717]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:59 compute-2 sudo[230793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztpunnfdqyhdtzurdlenwxnkxplqgwdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398578.3112159-4160-270685850365797/AnsiballZ_systemd.py'
Nov 29 06:42:59 compute-2 sudo[230793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:59 compute-2 python3.9[230795]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:42:59 compute-2 systemd[1]: Reloading.
Nov 29 06:42:59 compute-2 systemd-rc-local-generator[230821]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:42:59 compute-2 systemd-sysv-generator[230825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:42:59 compute-2 ceph-mon[77142]: pgmap v916: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:59 compute-2 sudo[230793]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:00 compute-2 sudo[230903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phcsnonfzbworqltqrtiymrbrqhrkmcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398578.3112159-4160-270685850365797/AnsiballZ_systemd.py'
Nov 29 06:43:00 compute-2 sudo[230903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:00.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:00.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:00 compute-2 python3.9[230905]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:43:00 compute-2 systemd[1]: Reloading.
Nov 29 06:43:00 compute-2 systemd-sysv-generator[230936]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:43:00 compute-2 systemd-rc-local-generator[230932]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:43:00 compute-2 systemd[1]: Starting nova_compute container...
Nov 29 06:43:00 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:43:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:00 compute-2 podman[230946]: 2025-11-29 06:43:00.911966976 +0000 UTC m=+0.101446393 container init e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:43:00 compute-2 podman[230946]: 2025-11-29 06:43:00.924469445 +0000 UTC m=+0.113948802 container start e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 06:43:00 compute-2 podman[230946]: nova_compute
Nov 29 06:43:00 compute-2 nova_compute[230961]: + sudo -E kolla_set_configs
Nov 29 06:43:00 compute-2 systemd[1]: Started nova_compute container.
Nov 29 06:43:00 compute-2 sudo[230903]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Validating config file
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying service configuration files
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Deleting /etc/ceph
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Creating directory /etc/ceph
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Writing out command to execute
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:01 compute-2 nova_compute[230961]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:43:01 compute-2 nova_compute[230961]: ++ cat /run_command
Nov 29 06:43:01 compute-2 nova_compute[230961]: + CMD=nova-compute
Nov 29 06:43:01 compute-2 nova_compute[230961]: + ARGS=
Nov 29 06:43:01 compute-2 nova_compute[230961]: + sudo kolla_copy_cacerts
Nov 29 06:43:01 compute-2 nova_compute[230961]: + [[ ! -n '' ]]
Nov 29 06:43:01 compute-2 nova_compute[230961]: + . kolla_extend_start
Nov 29 06:43:01 compute-2 nova_compute[230961]: Running command: 'nova-compute'
Nov 29 06:43:01 compute-2 nova_compute[230961]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 06:43:01 compute-2 nova_compute[230961]: + umask 0022
Nov 29 06:43:01 compute-2 nova_compute[230961]: + exec nova-compute
Nov 29 06:43:01 compute-2 ceph-mon[77142]: pgmap v917: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:43:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:02.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:43:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:43:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:02.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:43:02 compute-2 python3.9[231124]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:43:03 compute-2 nova_compute[230961]: 2025-11-29 06:43:03.108 230965 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:03 compute-2 nova_compute[230961]: 2025-11-29 06:43:03.109 230965 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:03 compute-2 nova_compute[230961]: 2025-11-29 06:43:03.109 230965 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:03 compute-2 nova_compute[230961]: 2025-11-29 06:43:03.109 230965 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 29 06:43:03 compute-2 nova_compute[230961]: 2025-11-29 06:43:03.240 230965 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:43:03 compute-2 nova_compute[230961]: 2025-11-29 06:43:03.263 230965 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:43:03 compute-2 nova_compute[230961]: 2025-11-29 06:43:03.264 230965 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 06:43:03 compute-2 sudo[231214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:43:03 compute-2 sudo[231214]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:43:03 compute-2 sudo[231214]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:03 compute-2 sudo[231257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:43:03 compute-2 sudo[231257]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:43:03 compute-2 sudo[231257]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:03 compute-2 python3.9[231328]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:43:03 compute-2 ceph-mon[77142]: pgmap v918: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:43:03 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:43:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:04.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.218 230965 INFO nova.virt.driver [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 29 06:43:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:04.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.342 230965 INFO nova.compute.provider_config [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.375 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.376 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.376 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.376 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.377 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.377 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.377 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.377 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.378 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.378 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.378 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.378 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.379 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.379 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.379 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.379 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.380 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.380 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.380 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.380 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.381 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.381 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.381 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.381 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.382 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.382 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.382 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.382 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.383 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.383 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.383 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.384 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.384 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.384 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.384 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.385 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.385 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.385 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.385 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.386 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.386 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.386 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.386 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.387 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.387 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.387 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.387 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.388 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.388 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.388 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.388 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.389 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.389 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.389 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.389 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.390 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.390 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.390 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.390 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.391 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.391 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.391 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.391 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.392 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.392 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.392 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.392 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.393 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.393 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.393 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.393 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.394 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.394 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.394 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.394 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.395 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.395 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.395 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.396 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.396 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.396 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.396 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.397 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.397 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.397 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.397 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.398 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.398 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.398 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.398 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.399 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.399 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.399 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.399 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.400 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.400 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.400 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.400 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.401 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.401 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.401 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.402 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.402 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.402 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.403 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.403 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.403 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.403 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.404 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.404 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.404 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.405 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.405 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.405 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.405 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.406 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.406 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.406 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.407 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.407 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.407 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.408 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.408 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.408 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.409 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.409 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.409 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.410 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.410 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.410 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.410 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.411 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.411 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.411 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.411 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.412 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.412 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.412 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.413 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.413 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.413 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.414 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.414 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.414 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.414 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.415 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.415 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.415 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.415 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.416 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.416 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.416 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.417 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.417 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.417 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.418 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.418 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.418 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.418 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.419 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.419 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.419 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.420 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.420 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.420 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.421 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.421 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.421 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.422 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.422 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.422 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.423 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.423 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.423 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.423 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.425 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.425 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.425 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.425 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.425 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.427 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.427 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.427 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.427 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.427 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.430 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.430 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.430 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.430 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.430 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.437 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.437 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.437 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.437 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.437 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.438 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.438 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.438 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.438 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.438 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.439 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.439 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.439 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.439 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.439 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.440 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.440 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.440 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.440 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.440 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.441 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.441 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.441 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.441 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.441 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.442 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.442 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.442 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.442 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.442 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.443 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.443 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.443 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.443 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.443 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.444 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.444 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.444 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.444 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.444 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.445 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.445 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.445 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.445 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.445 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.446 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.446 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.446 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.446 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.446 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.447 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.447 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.447 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.447 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.447 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.448 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.448 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.448 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.448 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.448 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.449 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.449 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.449 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.449 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.449 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.450 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.450 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.450 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.450 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.450 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.451 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.451 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.451 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.451 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.451 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.453 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.453 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.453 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.453 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.453 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.454 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.454 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.454 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.454 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.454 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.455 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.455 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.455 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.455 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.455 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.456 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.456 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.456 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.456 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.456 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.457 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.457 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.457 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.457 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.458 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.458 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.458 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.458 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.458 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.459 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.459 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.459 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.459 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.459 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.462 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.462 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.462 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.462 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.462 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.463 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.463 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.463 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.463 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.463 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.464 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.464 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.464 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.464 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.464 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.466 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.466 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.466 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.466 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.466 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.467 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.467 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.467 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.467 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.467 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.469 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.469 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.469 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.469 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.469 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.472 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.472 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.472 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.472 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.472 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.478 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.478 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.478 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.478 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.478 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.479 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.479 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.479 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.479 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.479 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.480 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.480 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.480 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.480 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.480 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.481 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.481 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.481 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.481 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.481 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.482 230965 WARNING oslo_config.cfg [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 06:43:04 compute-2 nova_compute[230961]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 06:43:04 compute-2 nova_compute[230961]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 06:43:04 compute-2 nova_compute[230961]: and ``live_migration_inbound_addr`` respectively.
Nov 29 06:43:04 compute-2 nova_compute[230961]: ).  Its value may be silently ignored in the future.
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.482 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.482 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.482 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.483 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.483 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.483 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.483 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.483 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.484 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.484 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.484 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.484 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.484 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.485 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.485 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.485 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.485 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.486 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.486 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rbd_secret_uuid        = 336ec58c-893b-528f-a0c1-6ed1196bc047 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.486 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.486 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.486 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.487 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.487 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.487 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.487 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.487 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.488 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.488 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.488 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.488 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.488 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.489 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.489 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.489 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.489 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.489 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.490 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.490 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.490 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.490 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.492 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.492 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.492 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.492 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.492 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.493 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.493 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.493 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.493 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.493 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.494 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.494 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.494 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.494 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.494 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.496 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.496 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.496 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.496 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.496 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.497 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.497 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.497 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.497 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.498 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.498 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.498 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.498 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.499 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.499 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.499 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.499 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.501 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.501 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.501 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.501 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.503 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.503 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.503 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.503 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.503 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.505 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.505 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.505 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.505 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.505 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.547 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.547 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.547 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.547 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.547 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.551 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.551 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.552 230965 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.570 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.571 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.571 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.571 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 29 06:43:04 compute-2 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 06:43:04 compute-2 systemd[1]: Started libvirt QEMU daemon.
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.637 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f1cdac99d00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.639 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f1cdac99d00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.640 230965 INFO nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Connection event '1' reason 'None'
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.661 230965 WARNING nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Nov 29 06:43:04 compute-2 nova_compute[230961]: 2025-11-29 06:43:04.661 230965 DEBUG nova.virt.libvirt.volume.mount [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 29 06:43:04 compute-2 python3.9[231479]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.458 230965 INFO nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 06:43:05 compute-2 nova_compute[230961]: 
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <host>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <uuid>4a1784f4-2c5f-4879-a5f6-acc886e56ebb</uuid>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <arch>x86_64</arch>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model>EPYC-Rome-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <vendor>AMD</vendor>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <microcode version='16777317'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <signature family='23' model='49' stepping='0'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='x2apic'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='tsc-deadline'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='osxsave'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='hypervisor'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='tsc_adjust'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='spec-ctrl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='stibp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='arch-capabilities'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='cmp_legacy'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='topoext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='virt-ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='lbrv'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='tsc-scale'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='vmcb-clean'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='pause-filter'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='pfthreshold'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='svme-addr-chk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='rdctl-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='skip-l1dfl-vmentry'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='mds-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature name='pschange-mc-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <pages unit='KiB' size='4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <pages unit='KiB' size='2048'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <pages unit='KiB' size='1048576'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <power_management>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <suspend_mem/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </power_management>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <iommu support='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <migration_features>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <live/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <uri_transports>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <uri_transport>tcp</uri_transport>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <uri_transport>rdma</uri_transport>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </uri_transports>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </migration_features>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <topology>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <cells num='1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <cell id='0'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:           <memory unit='KiB'>7864320</memory>
Nov 29 06:43:05 compute-2 nova_compute[230961]:           <pages unit='KiB' size='4'>1966080</pages>
Nov 29 06:43:05 compute-2 nova_compute[230961]:           <pages unit='KiB' size='2048'>0</pages>
Nov 29 06:43:05 compute-2 nova_compute[230961]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 29 06:43:05 compute-2 nova_compute[230961]:           <distances>
Nov 29 06:43:05 compute-2 nova_compute[230961]:             <sibling id='0' value='10'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:           </distances>
Nov 29 06:43:05 compute-2 nova_compute[230961]:           <cpus num='8'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:           </cpus>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         </cell>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </cells>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </topology>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <cache>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </cache>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <secmodel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model>selinux</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <doi>0</doi>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </secmodel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <secmodel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model>dac</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <doi>0</doi>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </secmodel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </host>
Nov 29 06:43:05 compute-2 nova_compute[230961]: 
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <guest>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <os_type>hvm</os_type>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <arch name='i686'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <wordsize>32</wordsize>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <domain type='qemu'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <domain type='kvm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </arch>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <features>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <pae/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <nonpae/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <acpi default='on' toggle='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <apic default='on' toggle='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <cpuselection/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <deviceboot/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <externalSnapshot/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </features>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </guest>
Nov 29 06:43:05 compute-2 nova_compute[230961]: 
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <guest>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <os_type>hvm</os_type>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <arch name='x86_64'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <wordsize>64</wordsize>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <domain type='qemu'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <domain type='kvm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </arch>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <features>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <acpi default='on' toggle='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <apic default='on' toggle='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <cpuselection/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <deviceboot/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <externalSnapshot/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </features>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </guest>
Nov 29 06:43:05 compute-2 nova_compute[230961]: 
Nov 29 06:43:05 compute-2 nova_compute[230961]: </capabilities>
Nov 29 06:43:05 compute-2 nova_compute[230961]: 
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.465 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.489 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 06:43:05 compute-2 nova_compute[230961]: <domainCapabilities>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <domain>kvm</domain>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <arch>i686</arch>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <vcpu max='240'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <iothreads supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <os supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <enum name='firmware'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <loader supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>rom</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pflash</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='readonly'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>yes</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>no</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='secure'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>no</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </loader>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </os>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>on</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>off</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='maximumMigratable'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>on</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>off</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <vendor>AMD</vendor>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='succor'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='custom' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-128'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-256'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-512'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='KnightsMill'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SierraForest'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='athlon'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='athlon-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='core2duo'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='core2duo-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='coreduo'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='coreduo-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='n270'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='n270-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='phenom'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='phenom-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <memoryBacking supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <enum name='sourceType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>file</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>anonymous</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>memfd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </memoryBacking>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <devices>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <disk supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='diskDevice'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>disk</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>cdrom</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>floppy</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>lun</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='bus'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>ide</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>fdc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>scsi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>sata</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </disk>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <graphics supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vnc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>egl-headless</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dbus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </graphics>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <video supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='modelType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vga</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>cirrus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>none</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>bochs</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>ramfb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </video>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <hostdev supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='mode'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>subsystem</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='startupPolicy'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>default</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>mandatory</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>requisite</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>optional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='subsysType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pci</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>scsi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='capsType'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='pciBackend'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </hostdev>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <rng supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>random</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>egd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>builtin</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </rng>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <filesystem supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='driverType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>path</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>handle</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtiofs</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </filesystem>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <tpm supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tpm-tis</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tpm-crb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>emulator</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>external</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendVersion'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>2.0</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </tpm>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <redirdev supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='bus'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </redirdev>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <channel supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pty</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>unix</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </channel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <crypto supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>qemu</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>builtin</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </crypto>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <interface supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>default</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>passt</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </interface>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <panic supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>isa</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>hyperv</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </panic>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <console supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>null</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pty</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dev</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>file</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pipe</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>stdio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>udp</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tcp</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>unix</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>qemu-vdagent</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dbus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </console>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </devices>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <features>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <gic supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <genid supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <backup supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <async-teardown supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <ps2 supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <sev supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <sgx supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <hyperv supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='features'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>relaxed</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vapic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>spinlocks</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vpindex</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>runtime</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>synic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>stimer</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>reset</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vendor_id</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>frequencies</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>reenlightenment</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tlbflush</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>ipi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>avic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>emsr_bitmap</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>xmm_input</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <defaults>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </defaults>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </hyperv>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <launchSecurity supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='sectype'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tdx</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </launchSecurity>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </features>
Nov 29 06:43:05 compute-2 nova_compute[230961]: </domainCapabilities>
Nov 29 06:43:05 compute-2 nova_compute[230961]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.500 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 06:43:05 compute-2 nova_compute[230961]: <domainCapabilities>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <domain>kvm</domain>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <arch>i686</arch>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <vcpu max='4096'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <iothreads supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <os supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <enum name='firmware'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <loader supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>rom</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pflash</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='readonly'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>yes</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>no</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='secure'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>no</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </loader>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </os>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>on</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>off</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='maximumMigratable'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>on</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>off</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <vendor>AMD</vendor>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='succor'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='custom' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-128'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-256'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-512'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='KnightsMill'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SierraForest'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='athlon'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='athlon-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='core2duo'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='core2duo-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='coreduo'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='coreduo-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='n270'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='n270-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='phenom'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='phenom-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <memoryBacking supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <enum name='sourceType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>file</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>anonymous</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>memfd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </memoryBacking>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <devices>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <disk supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='diskDevice'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>disk</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>cdrom</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>floppy</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>lun</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='bus'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>fdc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>scsi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>sata</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </disk>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <graphics supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vnc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>egl-headless</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dbus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </graphics>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <video supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='modelType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vga</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>cirrus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>none</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>bochs</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>ramfb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </video>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <hostdev supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='mode'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>subsystem</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='startupPolicy'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>default</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>mandatory</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>requisite</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>optional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='subsysType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pci</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>scsi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='capsType'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='pciBackend'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </hostdev>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <rng supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>random</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>egd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>builtin</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </rng>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <filesystem supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='driverType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>path</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>handle</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtiofs</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </filesystem>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <tpm supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tpm-tis</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tpm-crb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>emulator</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>external</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendVersion'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>2.0</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </tpm>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <redirdev supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='bus'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </redirdev>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <channel supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pty</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>unix</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </channel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <crypto supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>qemu</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>builtin</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </crypto>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <interface supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>default</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>passt</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </interface>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <panic supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>isa</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>hyperv</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </panic>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <console supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>null</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pty</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dev</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>file</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pipe</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>stdio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>udp</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tcp</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>unix</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>qemu-vdagent</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dbus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </console>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </devices>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <features>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <gic supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <genid supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <backup supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <async-teardown supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <ps2 supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <sev supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <sgx supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <hyperv supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='features'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>relaxed</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vapic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>spinlocks</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vpindex</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>runtime</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>synic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>stimer</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>reset</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vendor_id</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>frequencies</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>reenlightenment</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tlbflush</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>ipi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>avic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>emsr_bitmap</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>xmm_input</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <defaults>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </defaults>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </hyperv>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <launchSecurity supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='sectype'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tdx</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </launchSecurity>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </features>
Nov 29 06:43:05 compute-2 nova_compute[230961]: </domainCapabilities>
Nov 29 06:43:05 compute-2 nova_compute[230961]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.549 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.553 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 06:43:05 compute-2 nova_compute[230961]: <domainCapabilities>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <domain>kvm</domain>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <arch>x86_64</arch>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <vcpu max='240'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <iothreads supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <os supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <enum name='firmware'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <loader supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>rom</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pflash</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='readonly'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>yes</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>no</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='secure'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>no</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </loader>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </os>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>on</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>off</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='maximumMigratable'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>on</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>off</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <vendor>AMD</vendor>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='succor'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='custom' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-128'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-256'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-512'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='KnightsMill'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SierraForest'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 sudo[231693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqbeuipwhqpuxoacjqkwmobpsluwglsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398585.2260478-4341-276375154584750/AnsiballZ_podman_container.py'
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 sudo[231693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='athlon'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='athlon-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='core2duo'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='core2duo-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='coreduo'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='coreduo-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='n270'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='n270-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='phenom'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='phenom-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <memoryBacking supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <enum name='sourceType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>file</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>anonymous</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>memfd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </memoryBacking>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <devices>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <disk supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='diskDevice'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>disk</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>cdrom</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>floppy</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>lun</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='bus'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>ide</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>fdc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>scsi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>sata</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </disk>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <graphics supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vnc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>egl-headless</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dbus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </graphics>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <video supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='modelType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vga</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>cirrus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>none</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>bochs</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>ramfb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </video>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <hostdev supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='mode'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>subsystem</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='startupPolicy'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>default</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>mandatory</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>requisite</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>optional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='subsysType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pci</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>scsi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='capsType'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='pciBackend'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </hostdev>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <rng supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>random</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>egd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>builtin</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </rng>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <filesystem supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='driverType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>path</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>handle</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtiofs</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </filesystem>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <tpm supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tpm-tis</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tpm-crb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>emulator</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>external</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendVersion'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>2.0</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </tpm>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <redirdev supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='bus'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </redirdev>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <channel supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pty</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>unix</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </channel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <crypto supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>qemu</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>builtin</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </crypto>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <interface supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>default</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>passt</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </interface>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <panic supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>isa</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>hyperv</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </panic>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <console supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>null</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pty</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dev</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>file</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pipe</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>stdio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>udp</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tcp</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>unix</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>qemu-vdagent</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dbus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </console>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </devices>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <features>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <gic supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <genid supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <backup supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <async-teardown supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <ps2 supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <sev supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <sgx supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <hyperv supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='features'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>relaxed</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vapic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>spinlocks</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vpindex</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>runtime</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>synic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>stimer</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>reset</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vendor_id</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>frequencies</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>reenlightenment</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tlbflush</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>ipi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>avic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>emsr_bitmap</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>xmm_input</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <defaults>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </defaults>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </hyperv>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <launchSecurity supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='sectype'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tdx</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </launchSecurity>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </features>
Nov 29 06:43:05 compute-2 nova_compute[230961]: </domainCapabilities>
Nov 29 06:43:05 compute-2 nova_compute[230961]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.615 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 06:43:05 compute-2 nova_compute[230961]: <domainCapabilities>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <domain>kvm</domain>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <arch>x86_64</arch>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <vcpu max='4096'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <iothreads supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <os supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <enum name='firmware'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>efi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <loader supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>rom</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pflash</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='readonly'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>yes</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>no</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='secure'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>yes</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>no</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </loader>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </os>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>on</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>off</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='maximumMigratable'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>on</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>off</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <vendor>AMD</vendor>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='succor'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <mode name='custom' supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Denverton-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='EPYC-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-128'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-256'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx10-512'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Haswell-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='KnightsMill'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xop'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='la57'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SierraForest'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='hle'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='pku'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='erms'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='athlon'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='athlon-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='core2duo'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='core2duo-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='coreduo'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='coreduo-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='n270'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='n270-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='ss'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='phenom'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <blockers model='phenom-v1'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </blockers>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </mode>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <memoryBacking supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <enum name='sourceType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>file</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>anonymous</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <value>memfd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </memoryBacking>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <devices>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <disk supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='diskDevice'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>disk</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>cdrom</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>floppy</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>lun</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='bus'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>fdc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>scsi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>sata</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </disk>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <graphics supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vnc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>egl-headless</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dbus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </graphics>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <video supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='modelType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vga</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>cirrus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>none</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>bochs</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>ramfb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </video>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <hostdev supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='mode'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>subsystem</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='startupPolicy'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>default</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>mandatory</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>requisite</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>optional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='subsysType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pci</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>scsi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='capsType'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='pciBackend'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </hostdev>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <rng supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>random</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>egd</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>builtin</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </rng>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <filesystem supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='driverType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>path</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>handle</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>virtiofs</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </filesystem>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <tpm supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tpm-tis</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tpm-crb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>emulator</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>external</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendVersion'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>2.0</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </tpm>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <redirdev supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='bus'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>usb</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </redirdev>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <channel supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pty</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>unix</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </channel>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <crypto supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>qemu</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>builtin</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </crypto>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <interface supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='backendType'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>default</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>passt</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </interface>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <panic supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='model'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>isa</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>hyperv</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </panic>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <console supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='type'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>null</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vc</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pty</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dev</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>file</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>pipe</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>stdio</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>udp</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tcp</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>unix</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>qemu-vdagent</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>dbus</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </console>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </devices>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <features>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <gic supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <genid supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <backup supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <async-teardown supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <ps2 supported='yes'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <sev supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <sgx supported='no'/>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <hyperv supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='features'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>relaxed</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vapic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>spinlocks</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vpindex</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>runtime</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>synic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>stimer</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>reset</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>vendor_id</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>frequencies</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>reenlightenment</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tlbflush</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>ipi</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>avic</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>emsr_bitmap</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>xmm_input</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <defaults>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </defaults>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </hyperv>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     <launchSecurity supported='yes'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       <enum name='sectype'>
Nov 29 06:43:05 compute-2 nova_compute[230961]:         <value>tdx</value>
Nov 29 06:43:05 compute-2 nova_compute[230961]:       </enum>
Nov 29 06:43:05 compute-2 nova_compute[230961]:     </launchSecurity>
Nov 29 06:43:05 compute-2 nova_compute[230961]:   </features>
Nov 29 06:43:05 compute-2 nova_compute[230961]: </domainCapabilities>
Nov 29 06:43:05 compute-2 nova_compute[230961]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.673 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.674 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.674 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.675 230965 INFO nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Secure Boot support detected
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.677 230965 INFO nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.677 230965 INFO nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.693 230965 DEBUG nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 06:43:05 compute-2 nova_compute[230961]:   <model>Nehalem</model>
Nov 29 06:43:05 compute-2 nova_compute[230961]: </cpu>
Nov 29 06:43:05 compute-2 nova_compute[230961]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.697 230965 DEBUG nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.751 230965 INFO nova.virt.node [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Determined node identity 98b21ca7-b42c-4765-935a-26a89197ffb9 from /var/lib/nova/compute_id
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.778 230965 WARNING nova.compute.manager [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Compute nodes ['98b21ca7-b42c-4765-935a-26a89197ffb9'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.816 230965 INFO nova.compute.manager [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.863 230965 WARNING nova.compute.manager [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.863 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.864 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.864 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.864 230965 DEBUG nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:43:05 compute-2 nova_compute[230961]: 2025-11-29 06:43:05.865 230965 DEBUG oslo_concurrency.processutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:43:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:05 compute-2 python3.9[231695]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 06:43:05 compute-2 ceph-mon[77142]: pgmap v919: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:06 compute-2 sudo[231693]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:06.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:06 compute-2 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:43:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:06.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:43:06 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3410839554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:06 compute-2 nova_compute[230961]: 2025-11-29 06:43:06.302 230965 DEBUG oslo_concurrency.processutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:43:06 compute-2 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 06:43:06 compute-2 systemd[1]: Started libvirt nodedev daemon.
Nov 29 06:43:06 compute-2 nova_compute[230961]: 2025-11-29 06:43:06.599 230965 WARNING nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:43:06 compute-2 nova_compute[230961]: 2025-11-29 06:43:06.600 230965 DEBUG nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5295MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:43:06 compute-2 nova_compute[230961]: 2025-11-29 06:43:06.600 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:06 compute-2 nova_compute[230961]: 2025-11-29 06:43:06.601 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:06 compute-2 nova_compute[230961]: 2025-11-29 06:43:06.664 230965 WARNING nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] No compute node record for compute-2.ctlplane.example.com:98b21ca7-b42c-4765-935a-26a89197ffb9: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 98b21ca7-b42c-4765-935a-26a89197ffb9 could not be found.
Nov 29 06:43:06 compute-2 nova_compute[230961]: 2025-11-29 06:43:06.702 230965 INFO nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 98b21ca7-b42c-4765-935a-26a89197ffb9
Nov 29 06:43:06 compute-2 sudo[231914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdszwdrroyiqsegrducnnlyonyquiuai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398586.4552515-4365-50291422956531/AnsiballZ_systemd.py'
Nov 29 06:43:06 compute-2 sudo[231914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:06 compute-2 nova_compute[230961]: 2025-11-29 06:43:06.812 230965 DEBUG nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:43:06 compute-2 nova_compute[230961]: 2025-11-29 06:43:06.812 230965 DEBUG nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:43:06 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/3410839554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:06 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3797343968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:07 compute-2 python3.9[231916]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:43:07 compute-2 systemd[1]: Stopping nova_compute container...
Nov 29 06:43:07 compute-2 nova_compute[230961]: 2025-11-29 06:43:07.129 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:07 compute-2 nova_compute[230961]: 2025-11-29 06:43:07.129 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:43:07 compute-2 nova_compute[230961]: 2025-11-29 06:43:07.129 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:43:07 compute-2 nova_compute[230961]: 2025-11-29 06:43:07.129 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:43:07 compute-2 virtqemud[231501]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 29 06:43:07 compute-2 virtqemud[231501]: hostname: compute-2
Nov 29 06:43:07 compute-2 virtqemud[231501]: End of file while reading data: Input/output error
Nov 29 06:43:07 compute-2 systemd[1]: libpod-e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562.scope: Deactivated successfully.
Nov 29 06:43:07 compute-2 systemd[1]: libpod-e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562.scope: Consumed 3.677s CPU time.
Nov 29 06:43:07 compute-2 podman[231920]: 2025-11-29 06:43:07.56070514 +0000 UTC m=+0.469410406 container died e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 06:43:07 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562-userdata-shm.mount: Deactivated successfully.
Nov 29 06:43:07 compute-2 systemd[1]: var-lib-containers-storage-overlay-0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52-merged.mount: Deactivated successfully.
Nov 29 06:43:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:08.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:08.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:10.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:10.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:10 compute-2 podman[231920]: 2025-11-29 06:43:10.362167102 +0000 UTC m=+3.270872378 container cleanup e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:43:10 compute-2 podman[231920]: nova_compute
Nov 29 06:43:10 compute-2 podman[231951]: nova_compute
Nov 29 06:43:10 compute-2 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 29 06:43:10 compute-2 systemd[1]: Stopped nova_compute container.
Nov 29 06:43:10 compute-2 systemd[1]: Starting nova_compute container...
Nov 29 06:43:10 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:43:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:10 compute-2 podman[231965]: 2025-11-29 06:43:10.545604132 +0000 UTC m=+0.081307690 container init e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125)
Nov 29 06:43:10 compute-2 podman[231965]: 2025-11-29 06:43:10.553652091 +0000 UTC m=+0.089355639 container start e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 06:43:10 compute-2 podman[231965]: nova_compute
Nov 29 06:43:10 compute-2 nova_compute[231979]: + sudo -E kolla_set_configs
Nov 29 06:43:10 compute-2 systemd[1]: Started nova_compute container.
Nov 29 06:43:10 compute-2 sudo[231914]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Validating config file
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying service configuration files
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Deleting /etc/ceph
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Creating directory /etc/ceph
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Writing out command to execute
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:10 compute-2 nova_compute[231979]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:43:10 compute-2 nova_compute[231979]: ++ cat /run_command
Nov 29 06:43:10 compute-2 nova_compute[231979]: + CMD=nova-compute
Nov 29 06:43:10 compute-2 nova_compute[231979]: + ARGS=
Nov 29 06:43:10 compute-2 nova_compute[231979]: + sudo kolla_copy_cacerts
Nov 29 06:43:10 compute-2 nova_compute[231979]: + [[ ! -n '' ]]
Nov 29 06:43:10 compute-2 nova_compute[231979]: + . kolla_extend_start
Nov 29 06:43:10 compute-2 nova_compute[231979]: Running command: 'nova-compute'
Nov 29 06:43:10 compute-2 nova_compute[231979]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 06:43:10 compute-2 nova_compute[231979]: + umask 0022
Nov 29 06:43:10 compute-2 nova_compute[231979]: + exec nova-compute
Nov 29 06:43:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:11 compute-2 ceph-mon[77142]: pgmap v920: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:11 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2326258606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:11 compute-2 sudo[232016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:43:11 compute-2 sudo[232016]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:43:11 compute-2 sudo[232016]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:11 compute-2 sudo[232041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:43:11 compute-2 sudo[232041]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:43:11 compute-2 sudo[232041]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:12.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:12 compute-2 sudo[232191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtbvrhzbbuhowoqfwwewtmcabbjziwhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398591.9528124-4392-54343952067749/AnsiballZ_podman_container.py'
Nov 29 06:43:12 compute-2 sudo[232191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:12.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:12 compute-2 python3.9[232193]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 06:43:12 compute-2 systemd[1]: Started libpod-conmon-d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2.scope.
Nov 29 06:43:12 compute-2 nova_compute[231979]: 2025-11-29 06:43:12.678 231983 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:12 compute-2 nova_compute[231979]: 2025-11-29 06:43:12.680 231983 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:12 compute-2 nova_compute[231979]: 2025-11-29 06:43:12.680 231983 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:12 compute-2 nova_compute[231979]: 2025-11-29 06:43:12.681 231983 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 29 06:43:12 compute-2 ceph-mon[77142]: pgmap v921: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:12 compute-2 ceph-mon[77142]: pgmap v922: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:12 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:43:12 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9edd1240065e5824189d7d86d8b821e543ee68922e9bc4b93c6cec0888f7278b/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:12 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9edd1240065e5824189d7d86d8b821e543ee68922e9bc4b93c6cec0888f7278b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:12 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9edd1240065e5824189d7d86d8b821e543ee68922e9bc4b93c6cec0888f7278b/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:12 compute-2 podman[232222]: 2025-11-29 06:43:12.717039106 +0000 UTC m=+0.108277622 container init d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, config_id=edpm, container_name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible)
Nov 29 06:43:12 compute-2 podman[232222]: 2025-11-29 06:43:12.723977467 +0000 UTC m=+0.115215963 container start d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:43:12 compute-2 python3.9[232193]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Applying nova statedir ownership
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 29 06:43:12 compute-2 nova_compute_init[232244]: INFO:nova_statedir:Nova statedir ownership complete
Nov 29 06:43:12 compute-2 systemd[1]: libpod-d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2.scope: Deactivated successfully.
Nov 29 06:43:12 compute-2 nova_compute[231979]: 2025-11-29 06:43:12.826 231983 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:43:12 compute-2 podman[232259]: 2025-11-29 06:43:12.843170512 +0000 UTC m=+0.038793171 container died d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 29 06:43:12 compute-2 nova_compute[231979]: 2025-11-29 06:43:12.849 231983 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:43:12 compute-2 nova_compute[231979]: 2025-11-29 06:43:12.850 231983 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 06:43:12 compute-2 sudo[232191]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:12 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2-userdata-shm.mount: Deactivated successfully.
Nov 29 06:43:12 compute-2 systemd[1]: var-lib-containers-storage-overlay-9edd1240065e5824189d7d86d8b821e543ee68922e9bc4b93c6cec0888f7278b-merged.mount: Deactivated successfully.
Nov 29 06:43:12 compute-2 podman[232259]: 2025-11-29 06:43:12.875910175 +0000 UTC m=+0.071532814 container cleanup d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 29 06:43:12 compute-2 systemd[1]: libpod-conmon-d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2.scope: Deactivated successfully.
Nov 29 06:43:14 compute-2 ceph-mon[77142]: pgmap v923: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:14.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:14.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:43:15.134 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:43:15.134 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:43:15.135 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:15 compute-2 ceph-mon[77142]: pgmap v924: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:16.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:16 compute-2 sshd-session[201819]: Connection closed by 192.168.122.30 port 44458
Nov 29 06:43:16 compute-2 sshd-session[201816]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:43:16 compute-2 systemd[1]: session-50.scope: Deactivated successfully.
Nov 29 06:43:16 compute-2 systemd[1]: session-50.scope: Consumed 2min 16.828s CPU time.
Nov 29 06:43:16 compute-2 systemd-logind[784]: Session 50 logged out. Waiting for processes to exit.
Nov 29 06:43:16 compute-2 systemd-logind[784]: Removed session 50.
Nov 29 06:43:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:16.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.064 231983 INFO nova.virt.driver [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.177 231983 INFO nova.compute.provider_config [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.694 231983 DEBUG oslo_concurrency.lockutils [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.695 231983 DEBUG oslo_concurrency.lockutils [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.695 231983 DEBUG oslo_concurrency.lockutils [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.696 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.696 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.697 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.697 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.697 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.697 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.698 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.698 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.698 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.698 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.699 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.699 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.699 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.700 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.700 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.700 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.701 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.701 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.701 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.701 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.702 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.702 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.702 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.703 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.703 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.703 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.704 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.704 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.705 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.706 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.706 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.706 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.706 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.707 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.707 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.707 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.707 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.708 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.708 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.708 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.708 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.709 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.709 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.709 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.709 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.710 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.710 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.710 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.710 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.711 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.711 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.711 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.711 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.711 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.712 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.712 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.712 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.712 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.712 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.713 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.713 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.713 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.713 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.713 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.714 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.714 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.714 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.730 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.730 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.730 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.730 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.730 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.756 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.756 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.756 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.756 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.756 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.770 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.770 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.770 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.770 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.770 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.776 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.776 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.776 231983 WARNING oslo_config.cfg [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 06:43:17 compute-2 nova_compute[231979]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 06:43:17 compute-2 nova_compute[231979]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 06:43:17 compute-2 nova_compute[231979]: and ``live_migration_inbound_addr`` respectively.
Nov 29 06:43:17 compute-2 nova_compute[231979]: ).  Its value may be silently ignored in the future.
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.776 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.776 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.777 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.777 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.777 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.777 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.777 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.778 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.778 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.778 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.778 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.778 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.779 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.779 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.779 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.779 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.779 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.780 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.780 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rbd_secret_uuid        = 336ec58c-893b-528f-a0c1-6ed1196bc047 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.780 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.780 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.780 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.782 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.782 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.782 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.782 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.784 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.784 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.784 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.784 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.784 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.785 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.785 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.785 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.785 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.785 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.786 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.786 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.786 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.786 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.786 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.788 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.788 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.788 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.788 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.788 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.790 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.790 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.790 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.790 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.790 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.791 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.791 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.791 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.791 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.791 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.793 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.793 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.793 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.793 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.793 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.794 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.794 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.794 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.794 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.794 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.796 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.796 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.796 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.796 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.796 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.797 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.797 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.797 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.797 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.797 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.798 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.798 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.798 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.798 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.800 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.800 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.800 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.800 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.800 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.801 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.801 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.801 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.801 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.801 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.803 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.803 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.803 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.803 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.805 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.805 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.805 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.805 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.805 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.806 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.806 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.806 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.806 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.806 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.807 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.807 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.807 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.807 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.807 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.809 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.809 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.809 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.809 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.809 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.810 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.810 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.810 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.810 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.810 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.812 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.812 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.812 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.812 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.812 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.813 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.813 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.813 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.813 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.813 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.814 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.814 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.814 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.814 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.814 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.816 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.816 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.816 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.816 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.816 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.818 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.818 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.818 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.818 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.818 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.820 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.820 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.820 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.820 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.820 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.822 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.822 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.822 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.822 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.822 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.823 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.823 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.823 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.823 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.823 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.824 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.824 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.824 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.824 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.824 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.825 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.825 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.825 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.825 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.825 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.826 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.826 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.826 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.826 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.826 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.827 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.827 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.827 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.827 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.827 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.829 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.829 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.829 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.829 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.829 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.830 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.830 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.830 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.830 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.830 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.831 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.831 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.831 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.831 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.831 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.832 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.832 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.832 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.832 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.832 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.833 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.833 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.833 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.833 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.833 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.834 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.834 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.834 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.834 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.834 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.836 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.836 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.836 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.836 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.836 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.837 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.837 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.837 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.837 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.837 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.839 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.839 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.839 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.839 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.839 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.840 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.840 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.840 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.840 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.840 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.841 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.841 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.841 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.841 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.841 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.843 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.843 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.843 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.843 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.843 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.845 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.845 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.845 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.845 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.845 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.847 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.847 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.847 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.847 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.847 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.849 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.849 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.849 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.849 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.849 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.850 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.850 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.850 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.850 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.850 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.852 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.852 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.852 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.852 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.852 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.853 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.853 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.853 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.853 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.853 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.855 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.855 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.855 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.855 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.855 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:43:17 compute-2 nova_compute[231979]: 2025-11-29 06:43:17.858 231983 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 29 06:43:17 compute-2 ceph-mon[77142]: pgmap v925: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:18.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:18.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.317 231983 INFO nova.virt.node [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Determined node identity 98b21ca7-b42c-4765-935a-26a89197ffb9 from /var/lib/nova/compute_id
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.318 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.319 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.319 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.319 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.332 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f43a411bb20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.337 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f43a411bb20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.337 231983 INFO nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Connection event '1' reason 'None'
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.344 231983 INFO nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 06:43:19 compute-2 nova_compute[231979]: 
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <host>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <uuid>4a1784f4-2c5f-4879-a5f6-acc886e56ebb</uuid>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <arch>x86_64</arch>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model>EPYC-Rome-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <vendor>AMD</vendor>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <microcode version='16777317'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <signature family='23' model='49' stepping='0'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='x2apic'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='tsc-deadline'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='osxsave'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='hypervisor'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='tsc_adjust'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='spec-ctrl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='stibp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='arch-capabilities'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='cmp_legacy'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='topoext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='virt-ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='lbrv'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='tsc-scale'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='vmcb-clean'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='pause-filter'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='pfthreshold'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='svme-addr-chk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='rdctl-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='skip-l1dfl-vmentry'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='mds-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature name='pschange-mc-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <pages unit='KiB' size='4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <pages unit='KiB' size='2048'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <pages unit='KiB' size='1048576'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <power_management>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <suspend_mem/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </power_management>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <iommu support='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <migration_features>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <live/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <uri_transports>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <uri_transport>tcp</uri_transport>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <uri_transport>rdma</uri_transport>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </uri_transports>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </migration_features>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <topology>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <cells num='1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <cell id='0'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:           <memory unit='KiB'>7864320</memory>
Nov 29 06:43:19 compute-2 nova_compute[231979]:           <pages unit='KiB' size='4'>1966080</pages>
Nov 29 06:43:19 compute-2 nova_compute[231979]:           <pages unit='KiB' size='2048'>0</pages>
Nov 29 06:43:19 compute-2 nova_compute[231979]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 29 06:43:19 compute-2 nova_compute[231979]:           <distances>
Nov 29 06:43:19 compute-2 nova_compute[231979]:             <sibling id='0' value='10'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:           </distances>
Nov 29 06:43:19 compute-2 nova_compute[231979]:           <cpus num='8'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:           </cpus>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         </cell>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </cells>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </topology>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <cache>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </cache>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <secmodel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model>selinux</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <doi>0</doi>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </secmodel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <secmodel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model>dac</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <doi>0</doi>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </secmodel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </host>
Nov 29 06:43:19 compute-2 nova_compute[231979]: 
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <guest>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <os_type>hvm</os_type>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <arch name='i686'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <wordsize>32</wordsize>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <domain type='qemu'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <domain type='kvm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </arch>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <features>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <pae/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <nonpae/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <acpi default='on' toggle='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <apic default='on' toggle='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <cpuselection/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <deviceboot/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <externalSnapshot/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </features>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </guest>
Nov 29 06:43:19 compute-2 nova_compute[231979]: 
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <guest>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <os_type>hvm</os_type>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <arch name='x86_64'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <wordsize>64</wordsize>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <domain type='qemu'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <domain type='kvm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </arch>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <features>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <acpi default='on' toggle='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <apic default='on' toggle='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <cpuselection/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <deviceboot/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <externalSnapshot/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </features>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </guest>
Nov 29 06:43:19 compute-2 nova_compute[231979]: 
Nov 29 06:43:19 compute-2 nova_compute[231979]: </capabilities>
Nov 29 06:43:19 compute-2 nova_compute[231979]: 
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.350 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.353 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 06:43:19 compute-2 nova_compute[231979]: <domainCapabilities>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <domain>kvm</domain>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <arch>i686</arch>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <vcpu max='4096'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <iothreads supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <os supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <enum name='firmware'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <loader supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>rom</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pflash</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='readonly'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>yes</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>no</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='secure'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>no</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </loader>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </os>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>on</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>off</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='maximumMigratable'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>on</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>off</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <vendor>AMD</vendor>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='succor'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='custom' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='auto-ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='auto-ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-128'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-256'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-512'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='KnightsMill'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512er'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512pf'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512er'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512pf'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tbm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tbm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SierraForest'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cmpccxadd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cmpccxadd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='athlon'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='athlon-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='core2duo'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='core2duo-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='coreduo'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='coreduo-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='n270'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='n270-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='phenom'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='phenom-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <memoryBacking supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <enum name='sourceType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>file</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>anonymous</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>memfd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </memoryBacking>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <devices>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <disk supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='diskDevice'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>disk</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>cdrom</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>floppy</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>lun</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='bus'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>fdc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>scsi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>sata</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-non-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </disk>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <graphics supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vnc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>egl-headless</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dbus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </graphics>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <video supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='modelType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vga</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>cirrus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>none</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>bochs</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>ramfb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </video>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <hostdev supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='mode'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>subsystem</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='startupPolicy'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>default</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>mandatory</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>requisite</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>optional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='subsysType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pci</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>scsi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='capsType'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='pciBackend'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </hostdev>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <rng supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-non-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>random</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>egd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>builtin</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </rng>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <filesystem supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='driverType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>path</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>handle</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtiofs</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </filesystem>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <tpm supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tpm-tis</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tpm-crb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>emulator</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>external</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendVersion'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>2.0</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </tpm>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <redirdev supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='bus'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </redirdev>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <channel supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pty</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>unix</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </channel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <crypto supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>qemu</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>builtin</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </crypto>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <interface supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>default</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>passt</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </interface>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <panic supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>isa</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>hyperv</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </panic>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <console supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>null</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pty</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dev</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>file</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pipe</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>stdio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>udp</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tcp</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>unix</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>qemu-vdagent</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dbus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </console>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </devices>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <features>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <gic supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <genid supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <backup supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <async-teardown supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <ps2 supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <sev supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <sgx supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <hyperv supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='features'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>relaxed</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vapic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>spinlocks</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vpindex</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>runtime</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>synic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>stimer</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>reset</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vendor_id</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>frequencies</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>reenlightenment</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tlbflush</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>ipi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>avic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>emsr_bitmap</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>xmm_input</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <defaults>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </defaults>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </hyperv>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <launchSecurity supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='sectype'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tdx</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </launchSecurity>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </features>
Nov 29 06:43:19 compute-2 nova_compute[231979]: </domainCapabilities>
Nov 29 06:43:19 compute-2 nova_compute[231979]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.357 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 06:43:19 compute-2 nova_compute[231979]: <domainCapabilities>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <domain>kvm</domain>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <arch>i686</arch>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <vcpu max='240'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <iothreads supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <os supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <enum name='firmware'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <loader supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>rom</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pflash</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='readonly'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>yes</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>no</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='secure'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>no</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </loader>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </os>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>on</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>off</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='maximumMigratable'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>on</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>off</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <vendor>AMD</vendor>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='succor'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='custom' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='auto-ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='auto-ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-128'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-256'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-512'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='KnightsMill'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512er'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512pf'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512er'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512pf'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tbm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tbm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SierraForest'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cmpccxadd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cmpccxadd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='athlon'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='athlon-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='core2duo'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='core2duo-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='coreduo'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='coreduo-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='n270'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='n270-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='phenom'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='phenom-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <memoryBacking supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <enum name='sourceType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>file</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>anonymous</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>memfd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </memoryBacking>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <devices>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <disk supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='diskDevice'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>disk</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>cdrom</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>floppy</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>lun</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='bus'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>ide</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>fdc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>scsi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>sata</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-non-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </disk>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <graphics supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vnc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>egl-headless</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dbus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </graphics>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <video supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='modelType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vga</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>cirrus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>none</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>bochs</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>ramfb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </video>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <hostdev supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='mode'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>subsystem</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='startupPolicy'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>default</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>mandatory</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>requisite</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>optional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='subsysType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pci</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>scsi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='capsType'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='pciBackend'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </hostdev>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <rng supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-non-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>random</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>egd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>builtin</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </rng>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <filesystem supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='driverType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>path</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>handle</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtiofs</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </filesystem>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <tpm supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tpm-tis</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tpm-crb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>emulator</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>external</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendVersion'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>2.0</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </tpm>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <redirdev supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='bus'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </redirdev>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <channel supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pty</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>unix</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </channel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <crypto supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>qemu</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>builtin</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </crypto>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <interface supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>default</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>passt</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </interface>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <panic supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>isa</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>hyperv</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </panic>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <console supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>null</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pty</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dev</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>file</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pipe</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>stdio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>udp</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tcp</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>unix</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>qemu-vdagent</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dbus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </console>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </devices>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <features>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <gic supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <genid supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <backup supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <async-teardown supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <ps2 supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <sev supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <sgx supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <hyperv supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='features'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>relaxed</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vapic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>spinlocks</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vpindex</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>runtime</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>synic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>stimer</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>reset</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vendor_id</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>frequencies</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>reenlightenment</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tlbflush</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>ipi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>avic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>emsr_bitmap</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>xmm_input</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <defaults>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </defaults>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </hyperv>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <launchSecurity supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='sectype'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tdx</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </launchSecurity>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </features>
Nov 29 06:43:19 compute-2 nova_compute[231979]: </domainCapabilities>
Nov 29 06:43:19 compute-2 nova_compute[231979]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.383 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.389 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 06:43:19 compute-2 nova_compute[231979]: <domainCapabilities>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <domain>kvm</domain>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <arch>x86_64</arch>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <vcpu max='4096'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <iothreads supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <os supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <enum name='firmware'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>efi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <loader supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>rom</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pflash</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='readonly'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>yes</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>no</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='secure'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>yes</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>no</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </loader>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </os>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>on</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>off</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='maximumMigratable'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>on</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>off</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <vendor>AMD</vendor>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='succor'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='custom' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='auto-ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='auto-ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-128'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-256'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-512'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='KnightsMill'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512er'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512pf'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512er'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512pf'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tbm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tbm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SierraForest'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cmpccxadd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cmpccxadd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='athlon'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='athlon-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='core2duo'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='core2duo-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='coreduo'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='coreduo-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='n270'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='n270-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='phenom'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='phenom-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <memoryBacking supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <enum name='sourceType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>file</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>anonymous</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>memfd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </memoryBacking>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <devices>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <disk supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='diskDevice'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>disk</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>cdrom</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>floppy</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>lun</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='bus'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>fdc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>scsi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>sata</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-non-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </disk>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <graphics supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vnc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>egl-headless</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dbus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </graphics>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <video supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='modelType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vga</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>cirrus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>none</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>bochs</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>ramfb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </video>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <hostdev supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='mode'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>subsystem</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='startupPolicy'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>default</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>mandatory</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>requisite</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>optional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='subsysType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pci</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>scsi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='capsType'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='pciBackend'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </hostdev>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <rng supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-non-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>random</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>egd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>builtin</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </rng>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <filesystem supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='driverType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>path</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>handle</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtiofs</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </filesystem>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <tpm supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tpm-tis</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tpm-crb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>emulator</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>external</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendVersion'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>2.0</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </tpm>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <redirdev supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='bus'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </redirdev>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <channel supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pty</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>unix</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </channel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <crypto supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>qemu</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>builtin</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </crypto>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <interface supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>default</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>passt</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </interface>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <panic supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>isa</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>hyperv</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </panic>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <console supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>null</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pty</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dev</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>file</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pipe</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>stdio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>udp</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tcp</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>unix</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>qemu-vdagent</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dbus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </console>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </devices>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <features>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <gic supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <genid supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <backup supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <async-teardown supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <ps2 supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <sev supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <sgx supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <hyperv supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='features'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>relaxed</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vapic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>spinlocks</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vpindex</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>runtime</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>synic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>stimer</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>reset</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vendor_id</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>frequencies</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>reenlightenment</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tlbflush</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>ipi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>avic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>emsr_bitmap</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>xmm_input</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <defaults>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </defaults>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </hyperv>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <launchSecurity supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='sectype'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tdx</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </launchSecurity>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </features>
Nov 29 06:43:19 compute-2 nova_compute[231979]: </domainCapabilities>
Nov 29 06:43:19 compute-2 nova_compute[231979]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.468 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 06:43:19 compute-2 nova_compute[231979]: <domainCapabilities>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <domain>kvm</domain>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <arch>x86_64</arch>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <vcpu max='240'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <iothreads supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <os supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <enum name='firmware'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <loader supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>rom</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pflash</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='readonly'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>yes</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>no</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='secure'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>no</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </loader>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </os>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>on</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>off</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='maximumMigratable'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>on</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>off</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <vendor>AMD</vendor>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='succor'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <mode name='custom' supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Denverton-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='auto-ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='auto-ibrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amd-psfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='stibp-always-on'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='EPYC-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-128'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-256'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx10-512'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='prefetchiti'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Haswell-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='KnightsMill'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512er'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512pf'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512er'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512pf'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tbm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fma4'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tbm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xop'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='amx-tile'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-bf16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-fp16'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bitalg'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrc'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fzrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='la57'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='taa-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xfd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SierraForest'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cmpccxadd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ifma'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cmpccxadd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fbsdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='fsrs'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ibrs-all'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mcdt-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pbrsb-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='psdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='serialize'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vaes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='hle'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='rtm'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512bw'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512cd'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512dq'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512f'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='avx512vl'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='invpcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pcid'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='pku'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='mpx'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='core-capability'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='split-lock-detect'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='cldemote'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='erms'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='gfni'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdir64b'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='movdiri'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='xsaves'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='athlon'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='athlon-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='core2duo'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='core2duo-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='coreduo'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='coreduo-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='n270'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='n270-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='ss'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='phenom'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <blockers model='phenom-v1'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnow'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <feature name='3dnowext'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </blockers>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </mode>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <memoryBacking supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <enum name='sourceType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>file</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>anonymous</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <value>memfd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </memoryBacking>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <devices>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <disk supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='diskDevice'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>disk</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>cdrom</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>floppy</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>lun</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='bus'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>ide</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>fdc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>scsi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>sata</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-non-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </disk>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <graphics supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vnc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>egl-headless</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dbus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </graphics>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <video supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='modelType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vga</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>cirrus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>none</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>bochs</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>ramfb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </video>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <hostdev supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='mode'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>subsystem</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='startupPolicy'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>default</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>mandatory</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>requisite</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>optional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='subsysType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pci</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>scsi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='capsType'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='pciBackend'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </hostdev>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <rng supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtio-non-transitional</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>random</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>egd</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>builtin</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </rng>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <filesystem supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='driverType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>path</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>handle</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>virtiofs</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </filesystem>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <tpm supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tpm-tis</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tpm-crb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>emulator</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>external</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendVersion'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>2.0</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </tpm>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <redirdev supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='bus'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>usb</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </redirdev>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <channel supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pty</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>unix</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </channel>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <crypto supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>qemu</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendModel'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>builtin</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </crypto>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <interface supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='backendType'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>default</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>passt</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </interface>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <panic supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='model'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>isa</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>hyperv</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </panic>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <console supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='type'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>null</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vc</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pty</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dev</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>file</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>pipe</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>stdio</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>udp</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tcp</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>unix</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>qemu-vdagent</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>dbus</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </console>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </devices>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <features>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <gic supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <genid supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <backup supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <async-teardown supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <ps2 supported='yes'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <sev supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <sgx supported='no'/>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <hyperv supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='features'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>relaxed</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vapic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>spinlocks</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vpindex</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>runtime</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>synic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>stimer</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>reset</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>vendor_id</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>frequencies</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>reenlightenment</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tlbflush</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>ipi</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>avic</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>emsr_bitmap</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>xmm_input</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <defaults>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </defaults>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </hyperv>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     <launchSecurity supported='yes'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       <enum name='sectype'>
Nov 29 06:43:19 compute-2 nova_compute[231979]:         <value>tdx</value>
Nov 29 06:43:19 compute-2 nova_compute[231979]:       </enum>
Nov 29 06:43:19 compute-2 nova_compute[231979]:     </launchSecurity>
Nov 29 06:43:19 compute-2 nova_compute[231979]:   </features>
Nov 29 06:43:19 compute-2 nova_compute[231979]: </domainCapabilities>
Nov 29 06:43:19 compute-2 nova_compute[231979]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.536 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.536 231983 INFO nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Secure Boot support detected
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.538 231983 INFO nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.538 231983 INFO nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.547 231983 DEBUG nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 06:43:19 compute-2 nova_compute[231979]:   <model>Nehalem</model>
Nov 29 06:43:19 compute-2 nova_compute[231979]: </cpu>
Nov 29 06:43:19 compute-2 nova_compute[231979]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 29 06:43:19 compute-2 nova_compute[231979]: 2025-11-29 06:43:19.549 231983 DEBUG nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 29 06:43:19 compute-2 ceph-mon[77142]: pgmap v926: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:20.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:20 compute-2 nova_compute[231979]: 2025-11-29 06:43:20.223 231983 DEBUG nova.virt.libvirt.volume.mount [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 29 06:43:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:20.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:20 compute-2 nova_compute[231979]: 2025-11-29 06:43:20.638 231983 INFO nova.virt.node [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Determined node identity 98b21ca7-b42c-4765-935a-26a89197ffb9 from /var/lib/nova/compute_id
Nov 29 06:43:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:20 compute-2 nova_compute[231979]: 2025-11-29 06:43:20.936 231983 DEBUG nova.compute.manager [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Verified node 98b21ca7-b42c-4765-935a-26a89197ffb9 matches my host compute-2.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 29 06:43:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:22.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:22.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:22 compute-2 nova_compute[231979]: 2025-11-29 06:43:22.320 231983 INFO nova.compute.manager [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 29 06:43:22 compute-2 ceph-mon[77142]: pgmap v927: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:22 compute-2 nova_compute[231979]: 2025-11-29 06:43:22.767 231983 ERROR nova.compute.manager [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Could not retrieve compute node resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '98b21ca7-b42c-4765-935a-26a89197ffb9' not found: No resource provider with uuid 98b21ca7-b42c-4765-935a-26a89197ffb9 found  ", "request_id": "req-e6c1b923-6add-4afe-8ca9-60e8e7cf5088"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '98b21ca7-b42c-4765-935a-26a89197ffb9' not found: No resource provider with uuid 98b21ca7-b42c-4765-935a-26a89197ffb9 found  ", "request_id": "req-e6c1b923-6add-4afe-8ca9-60e8e7cf5088"}]}
Nov 29 06:43:24 compute-2 nova_compute[231979]: 2025-11-29 06:43:24.080 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:24 compute-2 nova_compute[231979]: 2025-11-29 06:43:24.080 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:24 compute-2 nova_compute[231979]: 2025-11-29 06:43:24.080 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:24 compute-2 nova_compute[231979]: 2025-11-29 06:43:24.081 231983 DEBUG nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:43:24 compute-2 nova_compute[231979]: 2025-11-29 06:43:24.081 231983 DEBUG oslo_concurrency.processutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:43:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:24.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:24 compute-2 ceph-mon[77142]: pgmap v928: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:24.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:24 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:43:24 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1216257790' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:24 compute-2 nova_compute[231979]: 2025-11-29 06:43:24.484 231983 DEBUG oslo_concurrency.processutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:43:24 compute-2 nova_compute[231979]: 2025-11-29 06:43:24.701 231983 WARNING nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:43:24 compute-2 nova_compute[231979]: 2025-11-29 06:43:24.702 231983 DEBUG nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5289MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:43:24 compute-2 nova_compute[231979]: 2025-11-29 06:43:24.702 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:24 compute-2 nova_compute[231979]: 2025-11-29 06:43:24.702 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:24 compute-2 podman[232360]: 2025-11-29 06:43:24.925505837 +0000 UTC m=+0.072346476 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:43:25 compute-2 podman[232359]: 2025-11-29 06:43:25.060277129 +0000 UTC m=+0.207260311 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:43:25 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1216257790' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:25 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/766159143' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:25 compute-2 ceph-mon[77142]: pgmap v929: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:25 compute-2 nova_compute[231979]: 2025-11-29 06:43:25.455 231983 ERROR nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '98b21ca7-b42c-4765-935a-26a89197ffb9' not found: No resource provider with uuid 98b21ca7-b42c-4765-935a-26a89197ffb9 found  ", "request_id": "req-ef7a20f9-7910-4778-ace3-4e24adb16a59"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '98b21ca7-b42c-4765-935a-26a89197ffb9' not found: No resource provider with uuid 98b21ca7-b42c-4765-935a-26a89197ffb9 found  ", "request_id": "req-ef7a20f9-7910-4778-ace3-4e24adb16a59"}]}
Nov 29 06:43:25 compute-2 nova_compute[231979]: 2025-11-29 06:43:25.456 231983 DEBUG nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:43:25 compute-2 nova_compute[231979]: 2025-11-29 06:43:25.456 231983 DEBUG nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:43:25 compute-2 nova_compute[231979]: 2025-11-29 06:43:25.663 231983 INFO nova.scheduler.client.report [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] [req-1d3f7d80-8081-4395-8499-b37bcb8b8af4] Created resource provider record via placement API for resource provider with UUID 98b21ca7-b42c-4765-935a-26a89197ffb9 and name compute-2.ctlplane.example.com.
Nov 29 06:43:25 compute-2 nova_compute[231979]: 2025-11-29 06:43:25.760 231983 DEBUG oslo_concurrency.processutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:43:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:26.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:43:26 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2300023521' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:26 compute-2 nova_compute[231979]: 2025-11-29 06:43:26.214 231983 DEBUG oslo_concurrency.processutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:43:26 compute-2 nova_compute[231979]: 2025-11-29 06:43:26.221 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 29 06:43:26 compute-2 nova_compute[231979]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 29 06:43:26 compute-2 nova_compute[231979]: 2025-11-29 06:43:26.221 231983 INFO nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] kernel doesn't support AMD SEV
Nov 29 06:43:26 compute-2 nova_compute[231979]: 2025-11-29 06:43:26.223 231983 DEBUG nova.compute.provider_tree [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Updating inventory in ProviderTree for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:43:26 compute-2 nova_compute[231979]: 2025-11-29 06:43:26.224 231983 DEBUG nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:43:26 compute-2 nova_compute[231979]: 2025-11-29 06:43:26.230 231983 DEBUG nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt baseline CPU <cpu>
Nov 29 06:43:26 compute-2 nova_compute[231979]:   <arch>x86_64</arch>
Nov 29 06:43:26 compute-2 nova_compute[231979]:   <model>Nehalem</model>
Nov 29 06:43:26 compute-2 nova_compute[231979]:   <vendor>AMD</vendor>
Nov 29 06:43:26 compute-2 nova_compute[231979]:   <topology sockets="8" cores="1" threads="1"/>
Nov 29 06:43:26 compute-2 nova_compute[231979]: </cpu>
Nov 29 06:43:26 compute-2 nova_compute[231979]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Nov 29 06:43:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:26.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:26 compute-2 nova_compute[231979]: 2025-11-29 06:43:26.599 231983 DEBUG nova.scheduler.client.report [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Updated inventory for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 29 06:43:26 compute-2 nova_compute[231979]: 2025-11-29 06:43:26.600 231983 DEBUG nova.compute.provider_tree [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Updating resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 29 06:43:26 compute-2 nova_compute[231979]: 2025-11-29 06:43:26.600 231983 DEBUG nova.compute.provider_tree [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Updating inventory in ProviderTree for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:43:26 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3359582458' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:26 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2300023521' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:26 compute-2 nova_compute[231979]: 2025-11-29 06:43:26.765 231983 DEBUG nova.compute.provider_tree [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Updating resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 29 06:43:27 compute-2 nova_compute[231979]: 2025-11-29 06:43:27.322 231983 DEBUG nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:43:27 compute-2 nova_compute[231979]: 2025-11-29 06:43:27.323 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:27 compute-2 nova_compute[231979]: 2025-11-29 06:43:27.323 231983 DEBUG nova.service [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 29 06:43:27 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2410258570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:27 compute-2 ceph-mon[77142]: pgmap v930: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:27 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/570139792' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:27 compute-2 podman[232429]: 2025-11-29 06:43:27.892832439 +0000 UTC m=+0.057856398 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 29 06:43:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:28.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:28.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:28 compute-2 nova_compute[231979]: 2025-11-29 06:43:28.663 231983 DEBUG nova.service [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 29 06:43:28 compute-2 nova_compute[231979]: 2025-11-29 06:43:28.664 231983 DEBUG nova.servicegroup.drivers.db [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 29 06:43:30 compute-2 ceph-mon[77142]: pgmap v931: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:30.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:30.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:31 compute-2 sudo[232451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:43:31 compute-2 sudo[232451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:43:31 compute-2 sudo[232451]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:31 compute-2 sudo[232476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:43:31 compute-2 sudo[232476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:43:31 compute-2 sudo[232476]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:31 compute-2 ceph-mon[77142]: pgmap v932: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:32.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:32.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:33 compute-2 nova_compute[231979]: 2025-11-29 06:43:33.667 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:33 compute-2 ceph-mon[77142]: pgmap v933: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:34.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:34.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:34 compute-2 nova_compute[231979]: 2025-11-29 06:43:34.288 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:36.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:36 compute-2 ceph-mon[77142]: pgmap v934: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:36.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:37 compute-2 ceph-mon[77142]: pgmap v935: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:38.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:38.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:40.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:40 compute-2 ceph-mon[77142]: pgmap v936: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:40.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:41 compute-2 ceph-mon[77142]: pgmap v937: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:42.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:42.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:43 compute-2 ceph-mon[77142]: pgmap v938: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:44.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:44.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:45 compute-2 ceph-mon[77142]: pgmap v939: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:46.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:46.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:47 compute-2 ceph-mon[77142]: pgmap v940: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:48.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:48.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:49 compute-2 ceph-mon[77142]: pgmap v941: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:50.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:50.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:51 compute-2 ceph-mon[77142]: pgmap v942: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:51 compute-2 sudo[232511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:43:51 compute-2 sudo[232511]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:43:51 compute-2 sudo[232511]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:51 compute-2 sudo[232536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:43:51 compute-2 sudo[232536]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:43:52 compute-2 sudo[232536]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:52.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:52.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:53 compute-2 sshd-session[232562]: Invalid user ubuntu from 92.118.39.92 port 55710
Nov 29 06:43:53 compute-2 ceph-mon[77142]: pgmap v943: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:54 compute-2 sshd-session[232562]: Connection closed by invalid user ubuntu 92.118.39.92 port 55710 [preauth]
Nov 29 06:43:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:54.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:54.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:55 compute-2 podman[232566]: 2025-11-29 06:43:55.892539464 +0000 UTC m=+0.051125673 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 29 06:43:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:55 compute-2 podman[232565]: 2025-11-29 06:43:55.919838815 +0000 UTC m=+0.080199380 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:43:55 compute-2 ceph-mon[77142]: pgmap v944: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:56.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:56.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:57 compute-2 ceph-mon[77142]: pgmap v945: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:58.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:43:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:58.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:58 compute-2 podman[232612]: 2025-11-29 06:43:58.903669856 +0000 UTC m=+0.068095675 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 06:44:00 compute-2 ceph-mon[77142]: pgmap v946: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:00.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:00.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:02.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:02 compute-2 ceph-mon[77142]: pgmap v947: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:02.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:03 compute-2 sudo[232634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:03 compute-2 sudo[232634]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:03 compute-2 sudo[232634]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:03 compute-2 ceph-mon[77142]: pgmap v948: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:03 compute-2 sudo[232659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:44:03 compute-2 sudo[232659]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:03 compute-2 sudo[232659]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:03 compute-2 sudo[232684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:03 compute-2 sudo[232684]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:03 compute-2 sudo[232684]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:03 compute-2 sudo[232709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:44:03 compute-2 sudo[232709]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:04.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:04 compute-2 sudo[232709]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:04.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:44:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:44:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:44:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:44:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:44:04 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:44:05 compute-2 ceph-mon[77142]: pgmap v949: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:06.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:06.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:07 compute-2 ceph-mon[77142]: pgmap v950: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:08.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:08.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:09 compute-2 ceph-mon[77142]: pgmap v951: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:10.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:10.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:11 compute-2 sudo[232769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:11 compute-2 sudo[232769]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:11 compute-2 sudo[232769]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:11 compute-2 sudo[232794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:44:11 compute-2 sudo[232794]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:11 compute-2 sudo[232794]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:12 compute-2 ceph-mon[77142]: pgmap v952: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:44:12 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:44:12 compute-2 sudo[232819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:12 compute-2 sudo[232819]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:12 compute-2 sudo[232819]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:12 compute-2 sudo[232844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:12 compute-2 sudo[232844]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:12 compute-2 sudo[232844]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:12.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:12.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:12 compute-2 nova_compute[231979]: 2025-11-29 06:44:12.863 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:12 compute-2 nova_compute[231979]: 2025-11-29 06:44:12.863 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:12 compute-2 nova_compute[231979]: 2025-11-29 06:44:12.864 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:44:12 compute-2 nova_compute[231979]: 2025-11-29 06:44:12.864 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:44:14 compute-2 ceph-mon[77142]: pgmap v953: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:14.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:14.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:44:15.134 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:44:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:44:15.135 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:44:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:44:15.135 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:44:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:16.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:16 compute-2 ceph-mon[77142]: pgmap v954: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:16.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:17 compute-2 ceph-mon[77142]: pgmap v955: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:17 compute-2 nova_compute[231979]: 2025-11-29 06:44:17.440 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.77 sec
Nov 29 06:44:17 compute-2 nova_compute[231979]: 2025-11-29 06:44:17.459 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:44:17 compute-2 nova_compute[231979]: 2025-11-29 06:44:17.459 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-2 nova_compute[231979]: 2025-11-29 06:44:17.459 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-2 nova_compute[231979]: 2025-11-29 06:44:17.460 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-2 nova_compute[231979]: 2025-11-29 06:44:17.460 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-2 nova_compute[231979]: 2025-11-29 06:44:17.460 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-2 nova_compute[231979]: 2025-11-29 06:44:17.460 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-2 nova_compute[231979]: 2025-11-29 06:44:17.461 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:44:17 compute-2 nova_compute[231979]: 2025-11-29 06:44:17.461 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:18.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:18.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:19 compute-2 ceph-mon[77142]: pgmap v956: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:20.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:20.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:22.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:22.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:22 compute-2 ceph-mon[77142]: pgmap v957: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:24.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:24.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:25 compute-2 ceph-mon[77142]: pgmap v958: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:26 compute-2 ceph-mon[77142]: pgmap v959: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:26.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:26.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:26 compute-2 podman[232878]: 2025-11-29 06:44:26.886660509 +0000 UTC m=+0.049935733 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 06:44:26 compute-2 podman[232877]: 2025-11-29 06:44:26.920753964 +0000 UTC m=+0.082334203 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:44:28 compute-2 ceph-mon[77142]: pgmap v960: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:28.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:28.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:29 compute-2 ceph-mon[77142]: pgmap v961: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:29 compute-2 podman[232922]: 2025-11-29 06:44:29.883682507 +0000 UTC m=+0.051152835 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd)
Nov 29 06:44:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:30.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:30.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:32 compute-2 ceph-mon[77142]: pgmap v962: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:32.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:32 compute-2 sudo[232943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:32 compute-2 sudo[232943]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:32 compute-2 sudo[232943]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:32 compute-2 sudo[232968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:32 compute-2 sudo[232968]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:32 compute-2 sudo[232968]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:32.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:33 compute-2 ceph-mon[77142]: pgmap v963: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:34.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:34.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.109 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.110 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.110 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.111 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.112 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:44:36 compute-2 ceph-mon[77142]: pgmap v964: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.134 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 8.69 sec
Nov 29 06:44:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:36.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:36.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:44:36 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1970020033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.574 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.724 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.725 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5332MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.725 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:44:36 compute-2 nova_compute[231979]: 2025-11-29 06:44:36.726 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:44:37 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2267221102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:37 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2284150235' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:37 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1970020033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:38 compute-2 ceph-mon[77142]: pgmap v965: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:38.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:38.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:39 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1438398855' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:44:39 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1438398855' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:44:39 compute-2 ceph-mon[77142]: pgmap v966: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:40 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/4273924820' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:44:40 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/4273924820' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:44:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:40.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:40.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:40 compute-2 nova_compute[231979]: 2025-11-29 06:44:40.485 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:44:40 compute-2 nova_compute[231979]: 2025-11-29 06:44:40.485 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:44:40 compute-2 nova_compute[231979]: 2025-11-29 06:44:40.543 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:44:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.926786) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680926873, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2761, "num_deletes": 509, "total_data_size": 6471047, "memory_usage": 6557936, "flush_reason": "Manual Compaction"}
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680959675, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 4239117, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15313, "largest_seqno": 18069, "table_properties": {"data_size": 4228438, "index_size": 6469, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3141, "raw_key_size": 23481, "raw_average_key_size": 18, "raw_value_size": 4205271, "raw_average_value_size": 3380, "num_data_blocks": 289, "num_entries": 1244, "num_filter_entries": 1244, "num_deletions": 509, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398425, "oldest_key_time": 1764398425, "file_creation_time": 1764398680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 32896 microseconds, and 8263 cpu microseconds.
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.959713) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 4239117 bytes OK
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.959732) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.961628) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.961644) EVENT_LOG_v1 {"time_micros": 1764398680961640, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.961659) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 6458171, prev total WAL file size 6458171, number of live WAL files 2.
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.963147) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323535' seq:0, type:0; will stop at (end)
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(4139KB)], [30(9302KB)]
Nov 29 06:44:40 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680963196, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 13764729, "oldest_snapshot_seqno": -1}
Nov 29 06:44:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:44:40 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/10673063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:40 compute-2 nova_compute[231979]: 2025-11-29 06:44:40.982 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:44:40 compute-2 nova_compute[231979]: 2025-11-29 06:44:40.987 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4723 keys, 11178089 bytes, temperature: kUnknown
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681043712, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11178089, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11142641, "index_size": 22538, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11845, "raw_key_size": 118499, "raw_average_key_size": 25, "raw_value_size": 11053323, "raw_average_value_size": 2340, "num_data_blocks": 935, "num_entries": 4723, "num_filter_entries": 4723, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.044207) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11178089 bytes
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.045670) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 170.5 rd, 138.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 9.1 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 5757, records dropped: 1034 output_compression: NoCompression
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.045697) EVENT_LOG_v1 {"time_micros": 1764398681045685, "job": 16, "event": "compaction_finished", "compaction_time_micros": 80730, "compaction_time_cpu_micros": 24703, "output_level": 6, "num_output_files": 1, "total_output_size": 11178089, "num_input_records": 5757, "num_output_records": 4723, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681046629, "job": 16, "event": "table_file_deletion", "file_number": 32}
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681048719, "job": 16, "event": "table_file_deletion", "file_number": 30}
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.963048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.048765) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.048772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.048774) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.048776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.048778) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-2 nova_compute[231979]: 2025-11-29 06:44:41.298 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:44:41 compute-2 nova_compute[231979]: 2025-11-29 06:44:41.300 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:44:41 compute-2 nova_compute[231979]: 2025-11-29 06:44:41.300 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:44:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:42.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:42.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:42 compute-2 ceph-mon[77142]: pgmap v967: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:42 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/945539473' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:42 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/10673063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:42 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3344413637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:43 compute-2 ceph-mon[77142]: pgmap v968: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:44.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:44.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:45 compute-2 ceph-mon[77142]: pgmap v969: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:46.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:46.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:47 compute-2 ceph-mon[77142]: pgmap v970: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:48.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:48.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:50 compute-2 ceph-mon[77142]: pgmap v971: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:50.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:50.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:51 compute-2 ceph-mon[77142]: pgmap v972: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:52.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:52.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:52 compute-2 sudo[233047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:52 compute-2 sudo[233047]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:52 compute-2 sudo[233047]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:52 compute-2 sudo[233073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:52 compute-2 sudo[233073]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:52 compute-2 sudo[233073]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:53 compute-2 ceph-mon[77142]: pgmap v973: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:54.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:54.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:56 compute-2 ceph-mon[77142]: pgmap v974: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:56.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:56.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:57 compute-2 ceph-mon[77142]: pgmap v975: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:57 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:44:57 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1524599570' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:44:57 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:44:57 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1524599570' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:44:57 compute-2 podman[233101]: 2025-11-29 06:44:57.884374586 +0000 UTC m=+0.049064379 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:44:57 compute-2 podman[233100]: 2025-11-29 06:44:57.914871117 +0000 UTC m=+0.081151372 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:44:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:58.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:44:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:58.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:58 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1524599570' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:44:58 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1524599570' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:44:59 compute-2 ceph-mon[77142]: pgmap v976: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:00.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:00.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:00 compute-2 podman[233145]: 2025-11-29 06:45:00.884891615 +0000 UTC m=+0.050208359 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:45:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:01 compute-2 ceph-mon[77142]: pgmap v977: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:02.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:02.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.881314) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702881437, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 459, "num_deletes": 251, "total_data_size": 609291, "memory_usage": 619112, "flush_reason": "Manual Compaction"}
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702889173, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 401811, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18074, "largest_seqno": 18528, "table_properties": {"data_size": 399311, "index_size": 600, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6105, "raw_average_key_size": 18, "raw_value_size": 394334, "raw_average_value_size": 1209, "num_data_blocks": 28, "num_entries": 326, "num_filter_entries": 326, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398682, "oldest_key_time": 1764398682, "file_creation_time": 1764398702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 7907 microseconds, and 4279 cpu microseconds.
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889239) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 401811 bytes OK
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889271) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.891429) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.891460) EVENT_LOG_v1 {"time_micros": 1764398702891453, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.891481) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 606475, prev total WAL file size 606475, number of live WAL files 2.
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.892096) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(392KB)], [33(10MB)]
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702892341, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 11579900, "oldest_snapshot_seqno": -1}
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4538 keys, 9460634 bytes, temperature: kUnknown
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702958888, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 9460634, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9427908, "index_size": 20264, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11397, "raw_key_size": 115326, "raw_average_key_size": 25, "raw_value_size": 9343174, "raw_average_value_size": 2058, "num_data_blocks": 832, "num_entries": 4538, "num_filter_entries": 4538, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.959146) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 9460634 bytes
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.960833) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.0 rd, 142.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 10.7 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(52.4) write-amplify(23.5) OK, records in: 5049, records dropped: 511 output_compression: NoCompression
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.960856) EVENT_LOG_v1 {"time_micros": 1764398702960846, "job": 18, "event": "compaction_finished", "compaction_time_micros": 66546, "compaction_time_cpu_micros": 19907, "output_level": 6, "num_output_files": 1, "total_output_size": 9460634, "num_input_records": 5049, "num_output_records": 4538, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702961043, "job": 18, "event": "table_file_deletion", "file_number": 35}
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702963208, "job": 18, "event": "table_file_deletion", "file_number": 33}
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.892013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.963281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.963285) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.963287) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.963289) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:02 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.963291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:03 compute-2 ceph-mon[77142]: pgmap v978: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:04.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:04.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:05 compute-2 ceph-mon[77142]: pgmap v979: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:06.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:06.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:08 compute-2 ceph-mon[77142]: pgmap v980: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:08.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:08.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:10 compute-2 ceph-mon[77142]: pgmap v981: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:10.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:10.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:11 compute-2 sudo[233170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:11 compute-2 sudo[233170]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:11 compute-2 sudo[233170]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:11 compute-2 sudo[233195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:45:11 compute-2 sudo[233195]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:11 compute-2 sudo[233195]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:11 compute-2 sudo[233220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:11 compute-2 sudo[233220]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:11 compute-2 sudo[233220]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:11 compute-2 sudo[233245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:45:11 compute-2 sudo[233245]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:12 compute-2 ceph-mon[77142]: pgmap v982: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:12 compute-2 sudo[233245]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:12.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:12.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:12 compute-2 sudo[233302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:12 compute-2 sudo[233302]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:12 compute-2 sudo[233302]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:12 compute-2 sudo[233327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:12 compute-2 sudo[233327]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:12 compute-2 sudo[233327]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:45:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:45:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:45:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:45:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:45:13 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:45:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:14.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:14.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:14 compute-2 ceph-mon[77142]: pgmap v983: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:45:15.136 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:45:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:45:15.137 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:45:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:45:15.137 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:45:15 compute-2 ceph-mon[77142]: pgmap v984: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:16.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:16.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:18.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:18.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:18 compute-2 ceph-mon[77142]: pgmap v985: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:19 compute-2 ceph-mon[77142]: pgmap v986: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:20.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:20.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:21 compute-2 ceph-mon[77142]: pgmap v987: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:22.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:22.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:23 compute-2 sudo[233357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:23 compute-2 sudo[233357]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:23 compute-2 sudo[233357]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:23 compute-2 sudo[233382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:45:23 compute-2 sudo[233382]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:23 compute-2 sudo[233382]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:24 compute-2 ceph-mon[77142]: pgmap v988: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:45:24 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:45:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:24.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:24.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:25 compute-2 ceph-mon[77142]: pgmap v989: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:26.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:26.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:27 compute-2 ceph-mon[77142]: pgmap v990: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:28.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:28.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:28 compute-2 podman[233411]: 2025-11-29 06:45:28.895852698 +0000 UTC m=+0.058520032 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 06:45:28 compute-2 podman[233410]: 2025-11-29 06:45:28.924687157 +0000 UTC m=+0.088344998 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 06:45:29 compute-2 ceph-mon[77142]: pgmap v991: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:30.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:30.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:31 compute-2 podman[233457]: 2025-11-29 06:45:31.910038571 +0000 UTC m=+0.074891558 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Nov 29 06:45:32 compute-2 ceph-mon[77142]: pgmap v992: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:32.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:32.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:32 compute-2 sudo[233478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:32 compute-2 sudo[233478]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:32 compute-2 sudo[233478]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:32 compute-2 sudo[233503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:32 compute-2 sudo[233503]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:32 compute-2 sudo[233503]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:33 compute-2 ceph-mon[77142]: pgmap v993: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:34.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:34.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:36 compute-2 ceph-mon[77142]: pgmap v994: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:36.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:36.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:37 compute-2 ceph-mon[77142]: pgmap v995: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:38.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:38.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:39 compute-2 ceph-mon[77142]: pgmap v996: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:40.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:40.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:41 compute-2 nova_compute[231979]: 2025-11-29 06:45:41.292 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:41 compute-2 nova_compute[231979]: 2025-11-29 06:45:41.292 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:41 compute-2 ceph-mon[77142]: pgmap v997: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:42.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:42.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:43 compute-2 ceph-mon[77142]: pgmap v998: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:44.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:44.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:45 compute-2 ceph-mon[77142]: pgmap v999: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:46.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:46.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:48 compute-2 ceph-mon[77142]: pgmap v1000: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:48.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:48.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:49 compute-2 ceph-mon[77142]: pgmap v1001: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:50.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:50.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:51 compute-2 ceph-mon[77142]: pgmap v1002: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:52.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:52.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:52 compute-2 nova_compute[231979]: 2025-11-29 06:45:52.881 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:52 compute-2 nova_compute[231979]: 2025-11-29 06:45:52.881 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:45:52 compute-2 nova_compute[231979]: 2025-11-29 06:45:52.881 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:45:53 compute-2 sudo[233538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:53 compute-2 sudo[233538]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:53 compute-2 sudo[233538]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:53 compute-2 sudo[233563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:53 compute-2 sudo[233563]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:53 compute-2 sudo[233563]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:53 compute-2 ceph-mon[77142]: pgmap v1003: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:54.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:54.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:55 compute-2 nova_compute[231979]: 2025-11-29 06:45:55.564 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:45:55 compute-2 nova_compute[231979]: 2025-11-29 06:45:55.565 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-2 nova_compute[231979]: 2025-11-29 06:45:55.566 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-2 nova_compute[231979]: 2025-11-29 06:45:55.566 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-2 nova_compute[231979]: 2025-11-29 06:45:55.567 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-2 nova_compute[231979]: 2025-11-29 06:45:55.567 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-2 nova_compute[231979]: 2025-11-29 06:45:55.567 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-2 nova_compute[231979]: 2025-11-29 06:45:55.568 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:45:55 compute-2 nova_compute[231979]: 2025-11-29 06:45:55.568 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:55 compute-2 ceph-mon[77142]: pgmap v1004: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:56.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:56.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:57 compute-2 ceph-mon[77142]: pgmap v1005: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:58.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:45:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:58.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:59 compute-2 ceph-mon[77142]: pgmap v1006: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:59 compute-2 podman[233591]: 2025-11-29 06:45:59.925539954 +0000 UTC m=+0.085809310 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:45:59 compute-2 podman[233592]: 2025-11-29 06:45:59.934658567 +0000 UTC m=+0.087384722 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:46:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:00.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:00.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:00 compute-2 sshd-session[233633]: Invalid user ubuntu from 92.118.39.92 port 49124
Nov 29 06:46:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:00 compute-2 sshd-session[233633]: Connection closed by invalid user ubuntu 92.118.39.92 port 49124 [preauth]
Nov 29 06:46:02 compute-2 ceph-mon[77142]: pgmap v1007: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:02.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:02.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:02 compute-2 podman[233637]: 2025-11-29 06:46:02.890988708 +0000 UTC m=+0.057265939 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 06:46:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1002726251' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:46:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1002726251' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:46:04 compute-2 ceph-mon[77142]: pgmap v1008: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:04.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:04.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:06 compute-2 ceph-mon[77142]: pgmap v1009: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:06.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:06.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:08 compute-2 ceph-mon[77142]: pgmap v1010: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:08.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:08.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.007 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.007 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.007 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.007 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.008 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:46:09 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:46:09 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2861061763' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.430 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.555 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.42 sec
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.589 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.590 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5304MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.591 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:46:09 compute-2 nova_compute[231979]: 2025-11-29 06:46:09.591 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:46:10 compute-2 ceph-mon[77142]: pgmap v1011: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:10 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2861061763' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:10 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3232813680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:10 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3671393982' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:10.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:10.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:10 compute-2 nova_compute[231979]: 2025-11-29 06:46:10.593 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:46:10 compute-2 nova_compute[231979]: 2025-11-29 06:46:10.593 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:46:10 compute-2 nova_compute[231979]: 2025-11-29 06:46:10.631 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:46:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:46:11 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/616557685' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:11 compute-2 nova_compute[231979]: 2025-11-29 06:46:11.054 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:46:11 compute-2 nova_compute[231979]: 2025-11-29 06:46:11.059 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:46:11 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/616557685' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:11 compute-2 nova_compute[231979]: 2025-11-29 06:46:11.865 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:46:11 compute-2 nova_compute[231979]: 2025-11-29 06:46:11.867 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:46:11 compute-2 nova_compute[231979]: 2025-11-29 06:46:11.867 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:46:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:12.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:12.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:12 compute-2 ceph-mon[77142]: pgmap v1012: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:12 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3071034240' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:12 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1964139985' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:13 compute-2 sudo[233708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:13 compute-2 sudo[233708]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:13 compute-2 sudo[233708]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:13 compute-2 sudo[233733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:13 compute-2 sudo[233733]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:13 compute-2 sudo[233733]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:13 compute-2 ceph-mon[77142]: pgmap v1013: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:14.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:14.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:46:15.138 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:46:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:46:15.139 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:46:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:46:15.139 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:46:15 compute-2 ceph-mon[77142]: pgmap v1014: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:16.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:16.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:17 compute-2 ceph-mon[77142]: pgmap v1015: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:18.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:18.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:19 compute-2 ceph-mon[77142]: pgmap v1016: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:20.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:20.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:21 compute-2 ceph-mon[77142]: pgmap v1017: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:22.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:22.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:23 compute-2 sudo[233763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:23 compute-2 sudo[233763]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:23 compute-2 sudo[233763]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:23 compute-2 sudo[233788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:46:23 compute-2 sudo[233788]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:23 compute-2 sudo[233788]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:23 compute-2 sudo[233813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:23 compute-2 sudo[233813]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:23 compute-2 sudo[233813]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:24 compute-2 sudo[233838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:46:24 compute-2 sudo[233838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:24 compute-2 ceph-mon[77142]: pgmap v1018: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:24.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:24.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:24 compute-2 sudo[233838]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:46:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:46:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:46:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:46:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:46:25 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:46:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:26 compute-2 ceph-mon[77142]: pgmap v1019: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:26.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:26.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:28 compute-2 ceph-mon[77142]: pgmap v1020: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:28.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:28.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:29 compute-2 ceph-mon[77142]: pgmap v1021: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:30.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:30.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:30 compute-2 podman[233899]: 2025-11-29 06:46:30.898850463 +0000 UTC m=+0.058512442 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:46:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:30 compute-2 podman[233898]: 2025-11-29 06:46:30.924731933 +0000 UTC m=+0.086982471 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:46:32 compute-2 ceph-mon[77142]: pgmap v1022: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:32.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:32.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:33 compute-2 sudo[233942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:33 compute-2 sudo[233942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:33 compute-2 sudo[233942]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:33 compute-2 sudo[233973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:46:33 compute-2 podman[233966]: 2025-11-29 06:46:33.21815158 +0000 UTC m=+0.054921057 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 29 06:46:33 compute-2 sudo[233973]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:33 compute-2 sudo[233973]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:33 compute-2 sudo[234011]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:33 compute-2 sudo[234011]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:33 compute-2 sudo[234011]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:33 compute-2 sudo[234036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:33 compute-2 sudo[234036]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:33 compute-2 sudo[234036]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:33 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:46:33 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:46:33 compute-2 ceph-mon[77142]: pgmap v1023: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:34.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:34.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:35 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:46:35.706 143385 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:05:03', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'aa:d2:09:dd:a5:e1'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:46:35 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:46:35.707 143385 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:46:35 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:46:35.708 143385 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fa6f2e5a-176a-4b37-8b2a-5aaf74119c47, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:46:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:36 compute-2 ceph-mon[77142]: pgmap v1024: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:36.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:36.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:37 compute-2 ceph-mon[77142]: pgmap v1025: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:38.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda3423c6f0 =====
Nov 29 06:46:38 compute-2 radosgw[83467]: ====== req done req=0x7fda3423c6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:38 compute-2 radosgw[83467]: beast: 0x7fda3423c6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:38.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:40 compute-2 ceph-mon[77142]: pgmap v1026: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:40.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:40.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:41 compute-2 ceph-mon[77142]: pgmap v1027: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:42.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:42.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:43 compute-2 ceph-mon[77142]: pgmap v1028: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:44.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:44.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:46 compute-2 ceph-mon[77142]: pgmap v1029: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:46.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:46.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:47 compute-2 ceph-mon[77142]: pgmap v1030: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:48.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:48.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:50 compute-2 ceph-mon[77142]: pgmap v1031: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:50.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:50.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:51 compute-2 ceph-mon[77142]: pgmap v1032: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:52.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:52.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:53 compute-2 sudo[234072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:53 compute-2 sudo[234072]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:53 compute-2 sudo[234072]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:53 compute-2 sudo[234097]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:53 compute-2 sudo[234097]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:53 compute-2 sudo[234097]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:54 compute-2 ceph-mon[77142]: pgmap v1033: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:54.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:54.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:55 compute-2 ceph-mon[77142]: pgmap v1034: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:56.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:56.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:58.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:46:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:58.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:58 compute-2 ceph-mon[77142]: pgmap v1035: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:59 compute-2 ceph-mon[77142]: pgmap v1036: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:00.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:00.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:01 compute-2 podman[234127]: 2025-11-29 06:47:01.885661555 +0000 UTC m=+0.050208151 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 06:47:01 compute-2 podman[234126]: 2025-11-29 06:47:01.916766145 +0000 UTC m=+0.083790307 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 06:47:02 compute-2 ceph-mon[77142]: pgmap v1037: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:02.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:02.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/3066267713' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:47:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/3066267713' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:47:03 compute-2 ceph-mon[77142]: pgmap v1038: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:03 compute-2 podman[234176]: 2025-11-29 06:47:03.884410431 +0000 UTC m=+0.051498675 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:47:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:04.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:04.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:06.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:06 compute-2 ceph-mon[77142]: pgmap v1039: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:06.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:08 compute-2 ceph-mon[77142]: pgmap v1040: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:08.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:08.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:09 compute-2 ceph-mon[77142]: pgmap v1041: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:10.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:10.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:11 compute-2 ceph-mon[77142]: pgmap v1042: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:11 compute-2 nova_compute[231979]: 2025-11-29 06:47:11.870 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:11 compute-2 nova_compute[231979]: 2025-11-29 06:47:11.870 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:11 compute-2 nova_compute[231979]: 2025-11-29 06:47:11.870 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:47:11 compute-2 nova_compute[231979]: 2025-11-29 06:47:11.870 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:47:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:12.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:12.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:13 compute-2 nova_compute[231979]: 2025-11-29 06:47:13.712 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:47:13 compute-2 nova_compute[231979]: 2025-11-29 06:47:13.713 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-2 nova_compute[231979]: 2025-11-29 06:47:13.713 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-2 nova_compute[231979]: 2025-11-29 06:47:13.713 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-2 nova_compute[231979]: 2025-11-29 06:47:13.714 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-2 nova_compute[231979]: 2025-11-29 06:47:13.714 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-2 nova_compute[231979]: 2025-11-29 06:47:13.714 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-2 nova_compute[231979]: 2025-11-29 06:47:13.714 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:47:13 compute-2 nova_compute[231979]: 2025-11-29 06:47:13.714 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-2 sudo[234202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:13 compute-2 sudo[234202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:13 compute-2 sudo[234202]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:13 compute-2 sudo[234227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:13 compute-2 sudo[234227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:13 compute-2 sudo[234227]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:14 compute-2 ceph-mon[77142]: pgmap v1043: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:14 compute-2 nova_compute[231979]: 2025-11-29 06:47:14.060 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:14 compute-2 nova_compute[231979]: 2025-11-29 06:47:14.060 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:14 compute-2 nova_compute[231979]: 2025-11-29 06:47:14.061 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:14 compute-2 nova_compute[231979]: 2025-11-29 06:47:14.061 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:47:14 compute-2 nova_compute[231979]: 2025-11-29 06:47:14.061 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:14.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:14 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:47:14 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3335706329' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:14.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:14 compute-2 nova_compute[231979]: 2025-11-29 06:47:14.529 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:14 compute-2 nova_compute[231979]: 2025-11-29 06:47:14.692 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:47:14 compute-2 nova_compute[231979]: 2025-11-29 06:47:14.694 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5328MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:47:14 compute-2 nova_compute[231979]: 2025-11-29 06:47:14.694 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:14 compute-2 nova_compute[231979]: 2025-11-29 06:47:14.694 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:15 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1028868458' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:15 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2186349468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:15 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/3335706329' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:47:15.139 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:47:15.140 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:47:15.140 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:15 compute-2 nova_compute[231979]: 2025-11-29 06:47:15.955 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:47:15 compute-2 nova_compute[231979]: 2025-11-29 06:47:15.955 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:47:15 compute-2 nova_compute[231979]: 2025-11-29 06:47:15.991 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:16 compute-2 ceph-mon[77142]: pgmap v1044: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:47:16 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/209762820' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:16 compute-2 nova_compute[231979]: 2025-11-29 06:47:16.448 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:16 compute-2 nova_compute[231979]: 2025-11-29 06:47:16.454 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:47:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:16.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:16.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:16 compute-2 nova_compute[231979]: 2025-11-29 06:47:16.814 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:47:16 compute-2 nova_compute[231979]: 2025-11-29 06:47:16.816 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:47:16 compute-2 nova_compute[231979]: 2025-11-29 06:47:16.816 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:17 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1939421518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:17 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/209762820' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:17 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1847130721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.802 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.802 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.833 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.833 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.833 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.859 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.860 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.860 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.860 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.861 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:47:17 compute-2 nova_compute[231979]: 2025-11-29 06:47:17.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:18 compute-2 nova_compute[231979]: 2025-11-29 06:47:18.218 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:18 compute-2 nova_compute[231979]: 2025-11-29 06:47:18.219 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:18 compute-2 nova_compute[231979]: 2025-11-29 06:47:18.219 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:18 compute-2 nova_compute[231979]: 2025-11-29 06:47:18.219 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:47:18 compute-2 nova_compute[231979]: 2025-11-29 06:47:18.220 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:18 compute-2 ceph-mon[77142]: pgmap v1045: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:18.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:18.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:47:18 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/687770592' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:18 compute-2 nova_compute[231979]: 2025-11-29 06:47:18.682 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:18 compute-2 nova_compute[231979]: 2025-11-29 06:47:18.846 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:47:18 compute-2 nova_compute[231979]: 2025-11-29 06:47:18.847 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5281MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:47:18 compute-2 nova_compute[231979]: 2025-11-29 06:47:18.847 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:18 compute-2 nova_compute[231979]: 2025-11-29 06:47:18.848 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:19 compute-2 nova_compute[231979]: 2025-11-29 06:47:19.361 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:47:19 compute-2 nova_compute[231979]: 2025-11-29 06:47:19.362 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:47:19 compute-2 nova_compute[231979]: 2025-11-29 06:47:19.381 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:19 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/687770592' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:19 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1602380716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:19 compute-2 ceph-mon[77142]: pgmap v1046: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:19 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:47:19 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/728103463' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:19 compute-2 nova_compute[231979]: 2025-11-29 06:47:19.867 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:19 compute-2 nova_compute[231979]: 2025-11-29 06:47:19.872 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:47:20 compute-2 nova_compute[231979]: 2025-11-29 06:47:20.000 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:47:20 compute-2 nova_compute[231979]: 2025-11-29 06:47:20.002 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:47:20 compute-2 nova_compute[231979]: 2025-11-29 06:47:20.003 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:20.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:20.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:20 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/728103463' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:20 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2414150286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:21 compute-2 nova_compute[231979]: 2025-11-29 06:47:21.003 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:21 compute-2 ceph-mon[77142]: pgmap v1047: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:22.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:22.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:23 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2547278201' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:24 compute-2 ceph-mon[77142]: pgmap v1048: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:24.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:24.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:25 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3841899284' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:26 compute-2 ceph-mon[77142]: pgmap v1049: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:26.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:26.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:27 compute-2 ceph-mon[77142]: pgmap v1050: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:28.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:28.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:30 compute-2 ceph-mon[77142]: pgmap v1051: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:30.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:30.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:32 compute-2 ceph-mon[77142]: pgmap v1052: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:32.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:32.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:32 compute-2 podman[234351]: 2025-11-29 06:47:32.914870034 +0000 UTC m=+0.067242154 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 06:47:32 compute-2 podman[234350]: 2025-11-29 06:47:32.922763252 +0000 UTC m=+0.087498888 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller)
Nov 29 06:47:33 compute-2 sudo[234397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:33 compute-2 sudo[234397]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:33 compute-2 sudo[234397]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:33 compute-2 sudo[234422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:47:33 compute-2 sudo[234422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:33 compute-2 sudo[234422]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:33 compute-2 sudo[234447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:33 compute-2 sudo[234447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:33 compute-2 sudo[234447]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:33 compute-2 sudo[234472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:47:33 compute-2 sudo[234472]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:33 compute-2 sudo[234472]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:34 compute-2 ceph-mon[77142]: pgmap v1053: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:34 compute-2 sudo[234529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:34 compute-2 sudo[234529]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:34 compute-2 sudo[234529]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:34 compute-2 sudo[234560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:34 compute-2 sudo[234560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:34 compute-2 sudo[234560]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:34 compute-2 podman[234553]: 2025-11-29 06:47:34.366966985 +0000 UTC m=+0.096009563 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:47:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.002000053s ======
Nov 29 06:47:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:34.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Nov 29 06:47:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:34.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:47:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:47:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:47:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:47:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:47:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:47:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:47:35 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:47:35 compute-2 ceph-mon[77142]: pgmap v1054: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:36.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:36.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:38 compute-2 ceph-mon[77142]: pgmap v1055: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:38.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:38.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:40 compute-2 ceph-mon[77142]: pgmap v1056: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:40.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:40.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:41 compute-2 ceph-mon[77142]: pgmap v1057: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:42.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:42.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:42 compute-2 sudo[234605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:42 compute-2 sudo[234605]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:42 compute-2 sudo[234605]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:42 compute-2 sudo[234630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:47:42 compute-2 sudo[234630]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:42 compute-2 sudo[234630]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:47:43 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:47:43 compute-2 ceph-mon[77142]: pgmap v1058: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:44.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:44.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:46 compute-2 ceph-mon[77142]: pgmap v1059: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:46.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:46.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:47 compute-2 ceph-mon[77142]: pgmap v1060: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:48.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:48.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:50 compute-2 ceph-mon[77142]: pgmap v1061: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:50.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:50.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:51 compute-2 ceph-mon[77142]: pgmap v1062: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:52.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:52.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:54 compute-2 ceph-mon[77142]: pgmap v1063: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:54 compute-2 sudo[234660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:54 compute-2 sudo[234660]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:54 compute-2 sudo[234660]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:54 compute-2 sudo[234685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:54 compute-2 sudo[234685]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:54 compute-2 sudo[234685]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:54.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:54.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:55 compute-2 ceph-mon[77142]: pgmap v1064: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:56.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:56.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.076353) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877076425, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1875, "num_deletes": 251, "total_data_size": 4554771, "memory_usage": 4605256, "flush_reason": "Manual Compaction"}
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877090092, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1737351, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18533, "largest_seqno": 20403, "table_properties": {"data_size": 1731823, "index_size": 2668, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14351, "raw_average_key_size": 20, "raw_value_size": 1719513, "raw_average_value_size": 2428, "num_data_blocks": 123, "num_entries": 708, "num_filter_entries": 708, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398703, "oldest_key_time": 1764398703, "file_creation_time": 1764398877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 13764 microseconds, and 4941 cpu microseconds.
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.090126) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1737351 bytes OK
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.090142) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.092088) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.092102) EVENT_LOG_v1 {"time_micros": 1764398877092098, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.092115) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 4546375, prev total WAL file size 4546375, number of live WAL files 2.
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.092978) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353033' seq:72057594037927935, type:22 .. '6D67727374617400373535' seq:0, type:0; will stop at (end)
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1696KB)], [36(9238KB)]
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877092997, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 11197985, "oldest_snapshot_seqno": -1}
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4812 keys, 8586067 bytes, temperature: kUnknown
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877153042, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 8586067, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8553880, "index_size": 19085, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12037, "raw_key_size": 121250, "raw_average_key_size": 25, "raw_value_size": 8466690, "raw_average_value_size": 1759, "num_data_blocks": 783, "num_entries": 4812, "num_filter_entries": 4812, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.153417) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 8586067 bytes
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.154600) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 185.9 rd, 142.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 9.0 +0.0 blob) out(8.2 +0.0 blob), read-write-amplify(11.4) write-amplify(4.9) OK, records in: 5246, records dropped: 434 output_compression: NoCompression
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.154617) EVENT_LOG_v1 {"time_micros": 1764398877154609, "job": 20, "event": "compaction_finished", "compaction_time_micros": 60221, "compaction_time_cpu_micros": 20626, "output_level": 6, "num_output_files": 1, "total_output_size": 8586067, "num_input_records": 5246, "num_output_records": 4812, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877155640, "job": 20, "event": "table_file_deletion", "file_number": 38}
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877157538, "job": 20, "event": "table_file_deletion", "file_number": 36}
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.092938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.157684) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.157688) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.157689) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.157691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.157692) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:58 compute-2 ceph-mon[77142]: pgmap v1065: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:58.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:47:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:58.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:59 compute-2 nova_compute[231979]: 2025-11-29 06:47:59.100 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 9.56 sec
Nov 29 06:47:59 compute-2 ceph-mon[77142]: pgmap v1066: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:48:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:00.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:48:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:00.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:01 compute-2 anacron[30873]: Job `cron.weekly' started
Nov 29 06:48:01 compute-2 anacron[30873]: Job `cron.weekly' terminated
Nov 29 06:48:02 compute-2 ceph-mon[77142]: pgmap v1067: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:48:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:02.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:48:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:48:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4080925924' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:48:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:48:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4080925924' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:48:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:02.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/4080925924' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:48:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/4080925924' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:48:03 compute-2 podman[234718]: 2025-11-29 06:48:03.888913492 +0000 UTC m=+0.050942485 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:48:03 compute-2 podman[234717]: 2025-11-29 06:48:03.91274735 +0000 UTC m=+0.078230844 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:48:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:04.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:04.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:04 compute-2 podman[234759]: 2025-11-29 06:48:04.919877947 +0000 UTC m=+0.083123483 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:48:05 compute-2 ceph-mon[77142]: pgmap v1068: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:06 compute-2 ceph-mon[77142]: pgmap v1069: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:48:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:06.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:48:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:06.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:07 compute-2 ceph-mon[77142]: pgmap v1070: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:08.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:08.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:10 compute-2 ceph-mon[77142]: pgmap v1071: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:10.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:10.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:10 compute-2 nova_compute[231979]: 2025-11-29 06:48:10.723 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 1.62 sec
Nov 29 06:48:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:11 compute-2 ceph-mon[77142]: pgmap v1072: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:12.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:12.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:12 compute-2 nova_compute[231979]: 2025-11-29 06:48:12.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:12 compute-2 nova_compute[231979]: 2025-11-29 06:48:12.862 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 06:48:14 compute-2 ceph-mon[77142]: pgmap v1073: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:14.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:14 compute-2 sudo[234786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:14.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:14 compute-2 sudo[234786]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:14 compute-2 sudo[234786]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:14 compute-2 sshd-session[234783]: Invalid user ubuntu from 92.118.39.92 port 42542
Nov 29 06:48:14 compute-2 sudo[234811]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:14 compute-2 sudo[234811]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:14 compute-2 sudo[234811]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:14 compute-2 sshd-session[234783]: Connection closed by invalid user ubuntu 92.118.39.92 port 42542 [preauth]
Nov 29 06:48:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:48:15.141 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:48:15.142 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:48:15.142 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:16 compute-2 ceph-mon[77142]: pgmap v1074: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:16.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:48:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:16.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:48:17 compute-2 ceph-mon[77142]: pgmap v1075: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:18.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:18.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:19 compute-2 ceph-mon[77142]: pgmap v1076: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:20.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:48:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:20.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:48:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:22 compute-2 ceph-mon[77142]: pgmap v1077: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:22.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:22.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:24.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:24 compute-2 ceph-mon[77142]: pgmap v1078: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:24.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:25 compute-2 ceph-mon[77142]: pgmap v1079: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:26.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:26.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:28 compute-2 ceph-mon[77142]: pgmap v1080: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:48:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:28.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:48:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:28.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:28 compute-2 nova_compute[231979]: 2025-11-29 06:48:28.747 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 06:48:28 compute-2 nova_compute[231979]: 2025-11-29 06:48:28.748 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:28 compute-2 nova_compute[231979]: 2025-11-29 06:48:28.748 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 06:48:29 compute-2 ceph-mon[77142]: pgmap v1081: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:30.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:30.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:31 compute-2 ceph-mon[77142]: pgmap v1082: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:31 compute-2 nova_compute[231979]: 2025-11-29 06:48:31.889 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 11.17 sec
Nov 29 06:48:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:32.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:32.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:34 compute-2 ceph-mon[77142]: pgmap v1083: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:34.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:34.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:34 compute-2 sudo[234846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:34 compute-2 sudo[234846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:34 compute-2 sudo[234846]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:34 compute-2 podman[234871]: 2025-11-29 06:48:34.887896125 +0000 UTC m=+0.047909495 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:48:34 compute-2 sudo[234886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:34 compute-2 sudo[234886]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:34 compute-2 sudo[234886]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:34 compute-2 podman[234869]: 2025-11-29 06:48:34.918763078 +0000 UTC m=+0.079909928 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:48:35 compute-2 podman[234942]: 2025-11-29 06:48:35.88623943 +0000 UTC m=+0.053582604 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:48:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:36 compute-2 ceph-mon[77142]: pgmap v1084: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:36.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:36.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:38 compute-2 ceph-mon[77142]: pgmap v1085: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:48:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:38.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:48:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:38.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:40 compute-2 ceph-mon[77142]: pgmap v1086: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:40.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:40.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:41 compute-2 ceph-mon[77142]: pgmap v1087: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:42.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:42.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:42 compute-2 sudo[234967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:42 compute-2 sudo[234967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:42 compute-2 sudo[234967]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:42 compute-2 sudo[234992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:48:42 compute-2 sudo[234992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:42 compute-2 sudo[234992]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:42 compute-2 sudo[235017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:42 compute-2 sudo[235017]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:42 compute-2 sudo[235017]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:42 compute-2 sudo[235042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:48:42 compute-2 sudo[235042]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:43 compute-2 ceph-mon[77142]: pgmap v1088: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:43 compute-2 sudo[235042]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:43 compute-2 sudo[235097]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:43 compute-2 sudo[235097]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:43 compute-2 sudo[235097]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:43 compute-2 sudo[235122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:48:43 compute-2 sudo[235122]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:43 compute-2 sudo[235122]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:43 compute-2 sudo[235147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:43 compute-2 sudo[235147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:43 compute-2 sudo[235147]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:43 compute-2 sudo[235172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 list-networks
Nov 29 06:48:43 compute-2 sudo[235172]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:43 compute-2 sudo[235172]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:44.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:48:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:44.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:48:45 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:45 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:45 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:45 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.076911) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925076984, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 711, "num_deletes": 251, "total_data_size": 1330634, "memory_usage": 1350960, "flush_reason": "Manual Compaction"}
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925084585, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 878455, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20409, "largest_seqno": 21114, "table_properties": {"data_size": 874937, "index_size": 1426, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7933, "raw_average_key_size": 19, "raw_value_size": 867897, "raw_average_value_size": 2121, "num_data_blocks": 62, "num_entries": 409, "num_filter_entries": 409, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398877, "oldest_key_time": 1764398877, "file_creation_time": 1764398925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 7706 microseconds, and 3291 cpu microseconds.
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.084628) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 878455 bytes OK
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.084645) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.086081) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.086094) EVENT_LOG_v1 {"time_micros": 1764398925086090, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.086109) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 1326846, prev total WAL file size 1326846, number of live WAL files 2.
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.086579) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(857KB)], [39(8384KB)]
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925086609, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 9464522, "oldest_snapshot_seqno": -1}
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 4703 keys, 7370470 bytes, temperature: kUnknown
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925137967, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 7370470, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7340060, "index_size": 17564, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11781, "raw_key_size": 119556, "raw_average_key_size": 25, "raw_value_size": 7255747, "raw_average_value_size": 1542, "num_data_blocks": 714, "num_entries": 4703, "num_filter_entries": 4703, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.138227) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 7370470 bytes
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.139626) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.9 rd, 143.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 8.2 +0.0 blob) out(7.0 +0.0 blob), read-write-amplify(19.2) write-amplify(8.4) OK, records in: 5221, records dropped: 518 output_compression: NoCompression
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.139646) EVENT_LOG_v1 {"time_micros": 1764398925139637, "job": 22, "event": "compaction_finished", "compaction_time_micros": 51458, "compaction_time_cpu_micros": 15666, "output_level": 6, "num_output_files": 1, "total_output_size": 7370470, "num_input_records": 5221, "num_output_records": 4703, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925139943, "job": 22, "event": "table_file_deletion", "file_number": 41}
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925141696, "job": 22, "event": "table_file_deletion", "file_number": 39}
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.086527) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.141729) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.141733) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.141735) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.141736) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.141739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:46 compute-2 ceph-mon[77142]: pgmap v1089: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:46 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:46.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:46.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:47 compute-2 ceph-mon[77142]: pgmap v1090: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:48.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:48.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:50 compute-2 ceph-mon[77142]: pgmap v1091: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:50 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:50 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:50 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:48:50 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:48:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:50.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:50.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:51 compute-2 nova_compute[231979]: 2025-11-29 06:48:51.093 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:48:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:48:51 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:48:51 compute-2 ceph-mon[77142]: pgmap v1092: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:52 compute-2 nova_compute[231979]: 2025-11-29 06:48:52.021 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 10.13 sec
Nov 29 06:48:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:52.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:48:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:52.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:48:54 compute-2 ceph-mon[77142]: pgmap v1093: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:54.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:54.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:55 compute-2 sudo[235221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:55 compute-2 sudo[235221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:55 compute-2 sudo[235221]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:55 compute-2 sudo[235246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:55 compute-2 sudo[235246]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:55 compute-2 sudo[235246]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:56 compute-2 ceph-mon[77142]: pgmap v1094: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:56.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:56.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:57 compute-2 ceph-mon[77142]: pgmap v1095: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:58.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:48:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:58.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:58 compute-2 sudo[235273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:58 compute-2 sudo[235273]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:58 compute-2 sudo[235273]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:58 compute-2 sudo[235298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:48:58 compute-2 sudo[235298]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:58 compute-2 sudo[235298]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:59 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:59 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:59 compute-2 ceph-mon[77142]: pgmap v1096: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:00.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:00.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:01 compute-2 ceph-mon[77142]: pgmap v1097: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:01 compute-2 nova_compute[231979]: 2025-11-29 06:49:01.822 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:01 compute-2 nova_compute[231979]: 2025-11-29 06:49:01.823 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:01 compute-2 nova_compute[231979]: 2025-11-29 06:49:01.823 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:49:01 compute-2 nova_compute[231979]: 2025-11-29 06:49:01.823 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:49:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:49:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:02.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:49:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:02.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:02 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/4223701543' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:49:02 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/4223701543' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:49:03 compute-2 ceph-mon[77142]: pgmap v1098: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:49:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:04.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:49:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:04.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:05 compute-2 podman[235327]: 2025-11-29 06:49:05.903774771 +0000 UTC m=+0.059671634 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 06:49:05 compute-2 podman[235326]: 2025-11-29 06:49:05.932710274 +0000 UTC m=+0.088636338 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 06:49:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:05 compute-2 podman[235367]: 2025-11-29 06:49:05.995023297 +0000 UTC m=+0.057841256 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 06:49:06 compute-2 ceph-mon[77142]: pgmap v1099: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:06.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:06.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:08.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:08.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:09 compute-2 ceph-mon[77142]: pgmap v1100: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:10 compute-2 ceph-mon[77142]: pgmap v1101: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:10.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:10.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:11 compute-2 ceph-mon[77142]: pgmap v1102: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:12.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:49:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:12.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:49:14 compute-2 ceph-mon[77142]: pgmap v1103: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:14.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:49:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:14.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:49:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:49:15.143 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:49:15.143 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:49:15.143 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:15 compute-2 sudo[235397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:15 compute-2 sudo[235397]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:15 compute-2 sudo[235397]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:15 compute-2 sudo[235422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:15 compute-2 sudo[235422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:15 compute-2 sudo[235422]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:15 compute-2 ceph-mon[77142]: pgmap v1104: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:16.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:16.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:18 compute-2 ceph-mon[77142]: pgmap v1105: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:18.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:49:18 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 3893 writes, 21K keys, 3893 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s
                                           Cumulative WAL: 3893 writes, 3893 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1412 writes, 7260 keys, 1412 commit groups, 1.0 writes per commit group, ingest: 15.00 MB, 0.02 MB/s
                                           Interval WAL: 1412 writes, 1412 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     32.4      0.81              0.07        11    0.074       0      0       0.0       0.0
                                             L6      1/0    7.03 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    120.7     99.7      0.91              0.24        10    0.091     49K   5241       0.0       0.0
                                            Sum      1/0    7.03 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     63.9     68.0      1.72              0.30        21    0.082     49K   5241       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.7    131.7    125.5      0.49              0.17        12    0.041     31K   3467       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    120.7     99.7      0.91              0.24        10    0.091     49K   5241       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     32.4      0.81              0.07        10    0.081       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.026, interval 0.009
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.11 GB write, 0.06 MB/s write, 0.11 GB read, 0.06 MB/s read, 1.7 seconds
                                           Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55be896f31f0#2 capacity: 304.00 MB usage: 8.06 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 6.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(419,7.66 MB,2.52095%) FilterBlock(21,140.86 KB,0.0452493%) IndexBlock(21,268.67 KB,0.0863075%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Nov 29 06:49:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:18.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:20 compute-2 ceph-mon[77142]: pgmap v1106: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:20.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:20.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:22 compute-2 ceph-mon[77142]: pgmap v1107: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:22.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:22.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:24.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:24 compute-2 ceph-mon[77142]: pgmap v1108: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:24.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:26.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:26.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:27 compute-2 ceph-mon[77142]: pgmap v1109: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:28.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:49:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:28.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:49:28 compute-2 ceph-mon[77142]: pgmap v1110: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:29 compute-2 ceph-mon[77142]: pgmap v1111: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:30.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:30.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:32 compute-2 ceph-mon[77142]: pgmap v1112: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:32.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:32.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:34 compute-2 nova_compute[231979]: 2025-11-29 06:49:34.164 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:49:34 compute-2 nova_compute[231979]: 2025-11-29 06:49:34.165 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-2 nova_compute[231979]: 2025-11-29 06:49:34.165 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-2 nova_compute[231979]: 2025-11-29 06:49:34.165 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-2 nova_compute[231979]: 2025-11-29 06:49:34.165 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-2 nova_compute[231979]: 2025-11-29 06:49:34.165 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-2 nova_compute[231979]: 2025-11-29 06:49:34.166 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-2 nova_compute[231979]: 2025-11-29 06:49:34.166 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:49:34 compute-2 nova_compute[231979]: 2025-11-29 06:49:34.166 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-2 ceph-mon[77142]: pgmap v1113: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:34.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:34.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:35 compute-2 ceph-mon[77142]: pgmap v1114: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:35 compute-2 sudo[235457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:35 compute-2 sudo[235457]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:35 compute-2 sudo[235457]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:35 compute-2 sudo[235482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:35 compute-2 sudo[235482]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:35 compute-2 sudo[235482]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:35 compute-2 nova_compute[231979]: 2025-11-29 06:49:35.461 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 23.44 sec
Nov 29 06:49:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:49:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:36.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:49:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:49:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:36.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:49:36 compute-2 podman[235510]: 2025-11-29 06:49:36.900214088 +0000 UTC m=+0.058016661 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 06:49:36 compute-2 podman[235509]: 2025-11-29 06:49:36.902219981 +0000 UTC m=+0.061140834 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:49:36 compute-2 podman[235508]: 2025-11-29 06:49:36.917130184 +0000 UTC m=+0.082409074 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:49:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:38.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:38 compute-2 ceph-mon[77142]: pgmap v1115: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:49:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:38.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:49:40 compute-2 ceph-mon[77142]: pgmap v1116: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:40.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:40.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:42 compute-2 ceph-mon[77142]: pgmap v1117: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:42.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:42.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:43 compute-2 ceph-mon[77142]: pgmap v1118: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:44.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:44.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:46 compute-2 ceph-mon[77142]: pgmap v1119: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:46.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:46.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:47 compute-2 ceph-mon[77142]: pgmap v1120: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:49:48 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 6280 writes, 25K keys, 6280 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 6280 writes, 1161 syncs, 5.41 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 484 writes, 738 keys, 484 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s
                                           Interval WAL: 484 writes, 238 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 06:49:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:48.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:48.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:50 compute-2 ceph-mon[77142]: pgmap v1121: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:49:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:50.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:49:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:50.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:51 compute-2 nova_compute[231979]: 2025-11-29 06:49:51.116 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:51 compute-2 nova_compute[231979]: 2025-11-29 06:49:51.116 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:51 compute-2 nova_compute[231979]: 2025-11-29 06:49:51.117 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:51 compute-2 nova_compute[231979]: 2025-11-29 06:49:51.117 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:49:51 compute-2 nova_compute[231979]: 2025-11-29 06:49:51.117 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:49:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:49:51 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2261730173' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:49:51 compute-2 nova_compute[231979]: 2025-11-29 06:49:51.534 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:49:51 compute-2 nova_compute[231979]: 2025-11-29 06:49:51.681 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:49:51 compute-2 nova_compute[231979]: 2025-11-29 06:49:51.682 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5312MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:49:51 compute-2 nova_compute[231979]: 2025-11-29 06:49:51.682 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:51 compute-2 nova_compute[231979]: 2025-11-29 06:49:51.683 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:52 compute-2 ceph-mgr[77504]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 06:49:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:52.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:52.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:53 compute-2 ceph-mon[77142]: pgmap v1122: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2261730173' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:49:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1398036893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:49:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1336302150' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:49:54 compute-2 ceph-mon[77142]: pgmap v1123: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:54.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:54.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:55 compute-2 sudo[235603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:55 compute-2 sudo[235603]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:55 compute-2 sudo[235603]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:55 compute-2 sudo[235628]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:55 compute-2 sudo[235628]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:55 compute-2 sudo[235628]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:56 compute-2 ceph-mon[77142]: pgmap v1124: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:56.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:56.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:58 compute-2 ceph-mon[77142]: pgmap v1125: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:58.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:49:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:58.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:59 compute-2 sudo[235655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:59 compute-2 sudo[235655]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-2 sudo[235655]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:59 compute-2 sudo[235680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:49:59 compute-2 sudo[235680]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-2 sudo[235680]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:59 compute-2 sudo[235705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:59 compute-2 sudo[235705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-2 sudo[235705]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:59 compute-2 sudo[235730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:49:59 compute-2 sudo[235730]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-2 ceph-mon[77142]: pgmap v1126: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:59 compute-2 sudo[235730]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:00.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:00 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:00 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:00 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:00 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:00 compute-2 ceph-mon[77142]: overall HEALTH_OK
Nov 29 06:50:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:00.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:02 compute-2 nova_compute[231979]: 2025-11-29 06:50:02.207 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 16.75 sec
Nov 29 06:50:02 compute-2 ceph-mon[77142]: pgmap v1127: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:50:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2319096117' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:50:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:50:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2319096117' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:50:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:02.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:02.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:04.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:04.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:05 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/2319096117' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:50:05 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/2319096117' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:50:05 compute-2 ceph-mon[77142]: pgmap v1128: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:06 compute-2 ceph-mon[77142]: pgmap v1129: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:06.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:06.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:50:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:50:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:50:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:50:07 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:50:07 compute-2 ceph-mon[77142]: pgmap v1130: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:07 compute-2 podman[235791]: 2025-11-29 06:50:07.916849596 +0000 UTC m=+0.077880319 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 29 06:50:07 compute-2 podman[235790]: 2025-11-29 06:50:07.922186259 +0000 UTC m=+0.085458862 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 06:50:07 compute-2 podman[235792]: 2025-11-29 06:50:07.92221843 +0000 UTC m=+0.080458628 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 29 06:50:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:08.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:08.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:09 compute-2 ceph-mon[77142]: pgmap v1131: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:10.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:10.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:11 compute-2 ceph-mon[77142]: pgmap v1132: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:12.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:12.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:14 compute-2 ceph-mon[77142]: pgmap v1133: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:14.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:14.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:50:15.144 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:50:15.144 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:50:15.144 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:15 compute-2 sudo[235857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:50:15 compute-2 sudo[235857]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:15 compute-2 sudo[235857]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:15 compute-2 sudo[235882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:50:15 compute-2 sudo[235882]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:15 compute-2 sudo[235882]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:15 compute-2 ceph-mon[77142]: pgmap v1134: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:16.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:16.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:17 compute-2 ceph-mon[77142]: pgmap v1135: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:18 compute-2 sudo[235909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:50:18 compute-2 sudo[235909]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:18 compute-2 sudo[235909]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:18 compute-2 sudo[235934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:50:18 compute-2 sudo[235934]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:18 compute-2 sudo[235934]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:18.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:18.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:19 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:20 compute-2 ceph-mon[77142]: pgmap v1136: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:20.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:20.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:21 compute-2 ceph-mon[77142]: pgmap v1137: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:22.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:22.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:22 compute-2 nova_compute[231979]: 2025-11-29 06:50:22.839 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 10.63 sec
Nov 29 06:50:23 compute-2 sshd-session[235961]: Invalid user ubuntu from 92.118.39.92 port 35960
Nov 29 06:50:23 compute-2 sshd-session[235961]: Connection closed by invalid user ubuntu 92.118.39.92 port 35960 [preauth]
Nov 29 06:50:24 compute-2 ceph-mon[77142]: pgmap v1138: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:24.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:24.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:26 compute-2 ceph-mon[77142]: pgmap v1139: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:26.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:26.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:27 compute-2 ceph-mon[77142]: pgmap v1140: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:28.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:28.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:30 compute-2 ceph-mon[77142]: pgmap v1141: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:30.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:30.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:31 compute-2 ceph-mon[77142]: pgmap v1142: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:32.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:32.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:33 compute-2 ceph-mon[77142]: pgmap v1143: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:34.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:34.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:35 compute-2 sudo[235970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:50:35 compute-2 sudo[235970]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:35 compute-2 sudo[235970]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:35 compute-2 sudo[235995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:50:35 compute-2 sudo[235995]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:35 compute-2 sudo[235995]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:36 compute-2 ceph-mon[77142]: pgmap v1144: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:36.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:36.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:37 compute-2 ceph-mon[77142]: pgmap v1145: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:38.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:38.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:38 compute-2 podman[236023]: 2025-11-29 06:50:38.893957828 +0000 UTC m=+0.055692474 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:50:38 compute-2 podman[236024]: 2025-11-29 06:50:38.901571352 +0000 UTC m=+0.060788081 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 06:50:38 compute-2 podman[236022]: 2025-11-29 06:50:38.924191379 +0000 UTC m=+0.086464430 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 06:50:39 compute-2 nova_compute[231979]: 2025-11-29 06:50:39.569 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:50:39 compute-2 nova_compute[231979]: 2025-11-29 06:50:39.570 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:50:39 compute-2 nova_compute[231979]: 2025-11-29 06:50:39.588 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing inventories for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 06:50:39 compute-2 nova_compute[231979]: 2025-11-29 06:50:39.604 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Updating ProviderTree inventory for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 06:50:39 compute-2 nova_compute[231979]: 2025-11-29 06:50:39.605 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Updating inventory in ProviderTree for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:50:39 compute-2 nova_compute[231979]: 2025-11-29 06:50:39.640 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing aggregate associations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 06:50:39 compute-2 nova_compute[231979]: 2025-11-29 06:50:39.680 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing trait associations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 06:50:39 compute-2 nova_compute[231979]: 2025-11-29 06:50:39.725 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:50:40 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3662469046' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:50:40 compute-2 nova_compute[231979]: 2025-11-29 06:50:40.351 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.627s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:40 compute-2 ceph-mon[77142]: pgmap v1146: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:40 compute-2 nova_compute[231979]: 2025-11-29 06:50:40.358 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:50:40 compute-2 nova_compute[231979]: 2025-11-29 06:50:40.415 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 7.58 sec
Nov 29 06:50:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:40.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:40.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:41 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3008365598' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:50:41 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/3662469046' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:50:41 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/4231777605' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:50:41 compute-2 ceph-mon[77142]: pgmap v1147: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:42.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:42.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:44 compute-2 ceph-mon[77142]: pgmap v1148: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:44.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:44.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:45 compute-2 ceph-mon[77142]: pgmap v1149: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:46.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:46.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:48 compute-2 ceph-mon[77142]: pgmap v1150: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:48.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:48.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:50 compute-2 ceph-mon[77142]: pgmap v1151: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:50.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:50.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:52 compute-2 ceph-mon[77142]: pgmap v1152: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:52.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:52.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:53 compute-2 ceph-mon[77142]: pgmap v1153: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:54.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:54.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:55 compute-2 ceph-mon[77142]: pgmap v1154: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:56 compute-2 sudo[236119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:50:56 compute-2 sudo[236119]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:56 compute-2 sudo[236119]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:56 compute-2 sudo[236144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:50:56 compute-2 sudo[236144]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:56 compute-2 sudo[236144]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:56.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:56.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:58 compute-2 ceph-mon[77142]: pgmap v1155: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:58.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:50:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:58.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:00 compute-2 ceph-mon[77142]: pgmap v1156: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:00.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:01 compute-2 ceph-mon[77142]: pgmap v1157: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:02 compute-2 radosgw[83467]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Nov 29 06:51:02 compute-2 radosgw[83467]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Nov 29 06:51:02 compute-2 radosgw[83467]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Nov 29 06:51:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:51:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3119268237' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:51:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:51:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3119268237' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:51:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:02.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:02.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/3119268237' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:51:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/3119268237' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:51:03 compute-2 radosgw[83467]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Nov 29 06:51:03 compute-2 radosgw[83467]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Nov 29 06:51:04 compute-2 ceph-mon[77142]: pgmap v1158: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:04.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:04.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:06 compute-2 ceph-mon[77142]: pgmap v1159: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 4.2 KiB/s rd, 0 B/s wr, 7 op/s
Nov 29 06:51:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:06.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:06.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:07 compute-2 ceph-mon[77142]: pgmap v1160: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 54 KiB/s rd, 0 B/s wr, 90 op/s
Nov 29 06:51:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:08.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:08.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:09 compute-2 podman[236177]: 2025-11-29 06:51:09.899108305 +0000 UTC m=+0.052607192 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:51:09 compute-2 podman[236178]: 2025-11-29 06:51:09.910061698 +0000 UTC m=+0.057341818 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 06:51:09 compute-2 podman[236176]: 2025-11-29 06:51:09.940664929 +0000 UTC m=+0.087053935 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:51:10 compute-2 ceph-mon[77142]: pgmap v1161: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 74 KiB/s rd, 0 B/s wr, 123 op/s
Nov 29 06:51:10 compute-2 nova_compute[231979]: 2025-11-29 06:51:10.198 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:51:10 compute-2 nova_compute[231979]: 2025-11-29 06:51:10.199 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:51:10 compute-2 nova_compute[231979]: 2025-11-29 06:51:10.200 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 78.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:10.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:10.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:10 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:11 compute-2 nova_compute[231979]: 2025-11-29 06:51:11.695 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 21.28 sec
Nov 29 06:51:12 compute-2 ceph-mon[77142]: pgmap v1162: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 74 KiB/s rd, 0 B/s wr, 123 op/s
Nov 29 06:51:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:12.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:14 compute-2 ceph-mon[77142]: pgmap v1163: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 82 KiB/s rd, 0 B/s wr, 136 op/s
Nov 29 06:51:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:14.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:14.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:51:15.145 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:51:15.146 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:51:15.146 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:15 compute-2 ceph-mon[77142]: pgmap v1164: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 82 KiB/s rd, 0 B/s wr, 136 op/s
Nov 29 06:51:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:16 compute-2 sudo[236242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:16 compute-2 sudo[236242]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:16 compute-2 sudo[236242]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:16 compute-2 sudo[236267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:16 compute-2 sudo[236267]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:16 compute-2 sudo[236267]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:16.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:16.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:18 compute-2 ceph-mon[77142]: pgmap v1165: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Nov 29 06:51:18 compute-2 sudo[236294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:18 compute-2 sudo[236294]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:18 compute-2 sudo[236294]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:18.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:18 compute-2 sudo[236319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:51:18 compute-2 sudo[236319]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:18 compute-2 sudo[236319]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:18 compute-2 sudo[236344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:18 compute-2 sudo[236344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:18 compute-2 sudo[236344]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:18 compute-2 sudo[236369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Nov 29 06:51:18 compute-2 sudo[236369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:18.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:19 compute-2 sudo[236369]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:19 compute-2 sudo[236414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:19 compute-2 sudo[236414]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:19 compute-2 sudo[236414]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:19 compute-2 sudo[236439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:51:19 compute-2 sudo[236439]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:19 compute-2 sudo[236439]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:19 compute-2 sudo[236464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:19 compute-2 sudo[236464]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:19 compute-2 sudo[236464]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:19 compute-2 sudo[236489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:51:19 compute-2 sudo[236489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:20 compute-2 sudo[236489]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:20 compute-2 sudo[236546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:20 compute-2 sudo[236546]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:20 compute-2 sudo[236546]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:20 compute-2 sudo[236571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:51:20 compute-2 sudo[236571]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:20 compute-2 sudo[236571]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:20 compute-2 sudo[236596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:20 compute-2 sudo[236596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:20 compute-2 sudo[236596]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:20 compute-2 sudo[236621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- inventory --format=json-pretty --filter-for-batch
Nov 29 06:51:20 compute-2 sudo[236621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:20 compute-2 ceph-mon[77142]: pgmap v1166: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 28 KiB/s rd, 0 B/s wr, 46 op/s
Nov 29 06:51:20 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:20 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:20 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 06:51:20 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 06:51:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:20.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:20 compute-2 podman[236687]: 2025-11-29 06:51:20.772471242 +0000 UTC m=+0.042598593 container create d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:51:20 compute-2 systemd[1]: Started libpod-conmon-d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d.scope.
Nov 29 06:51:20 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:51:20 compute-2 podman[236687]: 2025-11-29 06:51:20.752657691 +0000 UTC m=+0.022785062 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:51:20 compute-2 podman[236687]: 2025-11-29 06:51:20.865017714 +0000 UTC m=+0.135145075 container init d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:51:20 compute-2 podman[236687]: 2025-11-29 06:51:20.877279522 +0000 UTC m=+0.147406863 container start d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Nov 29 06:51:20 compute-2 podman[236687]: 2025-11-29 06:51:20.880379636 +0000 UTC m=+0.150507007 container attach d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:51:20 compute-2 eloquent_neumann[236703]: 167 167
Nov 29 06:51:20 compute-2 systemd[1]: libpod-d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d.scope: Deactivated successfully.
Nov 29 06:51:20 compute-2 conmon[236703]: conmon d7f10d0da8f49b7c2494 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d.scope/container/memory.events
Nov 29 06:51:20 compute-2 podman[236687]: 2025-11-29 06:51:20.888608086 +0000 UTC m=+0.158735437 container died d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:51:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:20.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:20 compute-2 systemd[1]: var-lib-containers-storage-overlay-4626f7b5634d1712807b4095bac8b1ff8d2f4f77119ab23bc2082f8a0c21e217-merged.mount: Deactivated successfully.
Nov 29 06:51:20 compute-2 podman[236687]: 2025-11-29 06:51:20.934018874 +0000 UTC m=+0.204146215 container remove d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:51:20 compute-2 systemd[1]: libpod-conmon-d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d.scope: Deactivated successfully.
Nov 29 06:51:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:21 compute-2 podman[236730]: 2025-11-29 06:51:21.116525417 +0000 UTC m=+0.047293099 container create a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:51:21 compute-2 systemd[1]: Started libpod-conmon-a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a.scope.
Nov 29 06:51:21 compute-2 systemd[1]: Started libcrun container.
Nov 29 06:51:21 compute-2 podman[236730]: 2025-11-29 06:51:21.099596453 +0000 UTC m=+0.030364155 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:51:21 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ab74fa081df70c85fb330a50280b93023915477db9227bcb1c8e2dd7ef488/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:51:21 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ab74fa081df70c85fb330a50280b93023915477db9227bcb1c8e2dd7ef488/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:51:21 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ab74fa081df70c85fb330a50280b93023915477db9227bcb1c8e2dd7ef488/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:51:21 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ab74fa081df70c85fb330a50280b93023915477db9227bcb1c8e2dd7ef488/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:51:21 compute-2 podman[236730]: 2025-11-29 06:51:21.21511785 +0000 UTC m=+0.145885592 container init a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 06:51:21 compute-2 podman[236730]: 2025-11-29 06:51:21.221179043 +0000 UTC m=+0.151946725 container start a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Nov 29 06:51:21 compute-2 podman[236730]: 2025-11-29 06:51:21.228674784 +0000 UTC m=+0.159442566 container attach a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Nov 29 06:51:21 compute-2 ceph-mon[77142]: pgmap v1167: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 7.8 KiB/s rd, 0 B/s wr, 13 op/s
Nov 29 06:51:22 compute-2 nova_compute[231979]: 2025-11-29 06:51:22.233 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:22 compute-2 nova_compute[231979]: 2025-11-29 06:51:22.234 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]: [
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:     {
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:         "available": false,
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:         "ceph_device": false,
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:         "lsm_data": {},
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:         "lvs": [],
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:         "path": "/dev/sr0",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:         "rejected_reasons": [
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "Insufficient space (<5GB)",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "Has a FileSystem"
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:         ],
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:         "sys_api": {
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "actuators": null,
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "device_nodes": "sr0",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "devname": "sr0",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "human_readable_size": "482.00 KB",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "id_bus": "ata",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "model": "QEMU DVD-ROM",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "nr_requests": "2",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "parent": "/dev/sr0",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "partitions": {},
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "path": "/dev/sr0",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "removable": "1",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "rev": "2.5+",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "ro": "0",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "rotational": "1",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "sas_address": "",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "sas_device_handle": "",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "scheduler_mode": "mq-deadline",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "sectors": 0,
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "sectorsize": "2048",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "size": 493568.0,
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "support_discard": "2048",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "type": "disk",
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:             "vendor": "QEMU"
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:         }
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]:     }
Nov 29 06:51:22 compute-2 flamboyant_matsumoto[236747]: ]
Nov 29 06:51:22 compute-2 systemd[1]: libpod-a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a.scope: Deactivated successfully.
Nov 29 06:51:22 compute-2 systemd[1]: libpod-a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a.scope: Consumed 1.157s CPU time.
Nov 29 06:51:22 compute-2 podman[236730]: 2025-11-29 06:51:22.371158805 +0000 UTC m=+1.301926507 container died a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:51:22 compute-2 systemd[1]: var-lib-containers-storage-overlay-463ab74fa081df70c85fb330a50280b93023915477db9227bcb1c8e2dd7ef488-merged.mount: Deactivated successfully.
Nov 29 06:51:22 compute-2 podman[236730]: 2025-11-29 06:51:22.431433041 +0000 UTC m=+1.362200723 container remove a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:51:22 compute-2 systemd[1]: libpod-conmon-a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a.scope: Deactivated successfully.
Nov 29 06:51:22 compute-2 sudo[236621]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:22.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:22.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:51:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:51:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:51:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:51:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:51:23 compute-2 ceph-mon[77142]: pgmap v1168: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 7.8 KiB/s rd, 0 B/s wr, 13 op/s
Nov 29 06:51:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:24.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:24.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:25 compute-2 ceph-mon[77142]: pgmap v1169: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:25 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:26.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:26.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:28 compute-2 nova_compute[231979]: 2025-11-29 06:51:28.275 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.58 sec
Nov 29 06:51:28 compute-2 ceph-mon[77142]: pgmap v1170: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.005000134s ======
Nov 29 06:51:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:28.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000134s
Nov 29 06:51:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:28.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:29 compute-2 nova_compute[231979]: 2025-11-29 06:51:29.385 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:29 compute-2 nova_compute[231979]: 2025-11-29 06:51:29.385 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:51:29 compute-2 nova_compute[231979]: 2025-11-29 06:51:29.385 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:51:29 compute-2 ceph-mon[77142]: pgmap v1171: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:30 compute-2 nova_compute[231979]: 2025-11-29 06:51:30.500 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:51:30 compute-2 nova_compute[231979]: 2025-11-29 06:51:30.500 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-2 nova_compute[231979]: 2025-11-29 06:51:30.500 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-2 nova_compute[231979]: 2025-11-29 06:51:30.500 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-2 nova_compute[231979]: 2025-11-29 06:51:30.501 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-2 nova_compute[231979]: 2025-11-29 06:51:30.501 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-2 nova_compute[231979]: 2025-11-29 06:51:30.501 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-2 nova_compute[231979]: 2025-11-29 06:51:30.501 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:51:30 compute-2 nova_compute[231979]: 2025-11-29 06:51:30.501 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:30.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:30.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:30 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:31 compute-2 nova_compute[231979]: 2025-11-29 06:51:31.080 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:31 compute-2 nova_compute[231979]: 2025-11-29 06:51:31.081 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:31 compute-2 nova_compute[231979]: 2025-11-29 06:51:31.081 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:31 compute-2 nova_compute[231979]: 2025-11-29 06:51:31.081 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:51:31 compute-2 nova_compute[231979]: 2025-11-29 06:51:31.082 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:51:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:51:31 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2305942895' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:31 compute-2 nova_compute[231979]: 2025-11-29 06:51:31.552 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:51:31 compute-2 nova_compute[231979]: 2025-11-29 06:51:31.710 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:51:31 compute-2 nova_compute[231979]: 2025-11-29 06:51:31.712 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5255MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:51:31 compute-2 nova_compute[231979]: 2025-11-29 06:51:31.712 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:31 compute-2 nova_compute[231979]: 2025-11-29 06:51:31.712 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:32 compute-2 ceph-mon[77142]: pgmap v1172: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:32 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/990427680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:32 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2305942895' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:32 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/763194160' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:32.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:32.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:32 compute-2 nova_compute[231979]: 2025-11-29 06:51:32.994 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:51:32 compute-2 nova_compute[231979]: 2025-11-29 06:51:32.994 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:51:33 compute-2 nova_compute[231979]: 2025-11-29 06:51:33.070 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:51:33 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:51:33 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1735230033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:33 compute-2 nova_compute[231979]: 2025-11-29 06:51:33.511 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:51:33 compute-2 nova_compute[231979]: 2025-11-29 06:51:33.518 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:51:33 compute-2 ceph-mon[77142]: pgmap v1173: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:34.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:34.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:35 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3831186218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:35 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1735230033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:35 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/4133065214' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:35 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:36 compute-2 sudo[237997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:36 compute-2 sudo[237997]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:36 compute-2 sudo[237997]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:36 compute-2 sudo[238022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:51:36 compute-2 sudo[238022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:36 compute-2 sudo[238022]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:36 compute-2 sudo[238046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:36 compute-2 sudo[238046]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:36 compute-2 sudo[238046]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:36 compute-2 ceph-mon[77142]: pgmap v1174: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:36 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:36 compute-2 sudo[238072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:36 compute-2 sudo[238072]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:36 compute-2 sudo[238072]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:36.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:36.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:37 compute-2 ceph-mon[77142]: pgmap v1175: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:38.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:38.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:40 compute-2 nova_compute[231979]: 2025-11-29 06:51:40.527 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:51:40 compute-2 nova_compute[231979]: 2025-11-29 06:51:40.528 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:51:40 compute-2 nova_compute[231979]: 2025-11-29 06:51:40.529 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 8.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:40 compute-2 ceph-mon[77142]: pgmap v1176: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:40.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:40 compute-2 podman[238101]: 2025-11-29 06:51:40.918067538 +0000 UTC m=+0.071006875 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:51:40 compute-2 podman[238102]: 2025-11-29 06:51:40.932571837 +0000 UTC m=+0.090075496 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 06:51:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:40.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:40 compute-2 podman[238100]: 2025-11-29 06:51:40.959063647 +0000 UTC m=+0.108794748 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 06:51:40 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:42 compute-2 ceph-mon[77142]: pgmap v1177: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:42.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:42.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:44 compute-2 ceph-mon[77142]: pgmap v1178: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:44.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:44.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:45 compute-2 ceph-mon[77142]: pgmap v1179: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:45 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:46.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:46.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:47 compute-2 ceph-mon[77142]: pgmap v1180: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:48.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:48.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:49 compute-2 ceph-mon[77142]: pgmap v1181: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:50.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:50.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:52 compute-2 ceph-mon[77142]: pgmap v1182: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:52.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:52.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:53 compute-2 ceph-mon[77142]: pgmap v1183: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:54.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:54.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:56 compute-2 nova_compute[231979]: 2025-11-29 06:51:56.044 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 7.77 sec
Nov 29 06:51:56 compute-2 ceph-mon[77142]: pgmap v1184: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:56 compute-2 sudo[238171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:56 compute-2 sudo[238171]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:56 compute-2 sudo[238171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:56 compute-2 sudo[238196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:56 compute-2 sudo[238196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:56 compute-2 sudo[238196]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:56.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:56.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:57 compute-2 ceph-mon[77142]: pgmap v1185: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:58.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:51:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:58.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:00 compute-2 ceph-mon[77142]: pgmap v1186: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:00.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:00.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:02 compute-2 ceph-mon[77142]: pgmap v1187: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:52:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3126230656' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:52:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:52:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3126230656' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:52:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:02.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:02.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:03 compute-2 sshd-session[238225]: Connection closed by 192.161.163.11 port 19886
Nov 29 06:52:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/3126230656' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:52:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/3126230656' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:52:04 compute-2 ceph-mon[77142]: pgmap v1188: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:04.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:04.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:05 compute-2 sshd-session[238226]: Connection closed by authenticating user root 192.161.163.11 port 20072 [preauth]
Nov 29 06:52:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:06 compute-2 ceph-mon[77142]: pgmap v1189: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:06.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:06.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:07 compute-2 ceph-mon[77142]: pgmap v1190: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:08.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:08.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:09 compute-2 ceph-mon[77142]: pgmap v1191: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:10.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:11.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:11 compute-2 podman[238233]: 2025-11-29 06:52:11.890621933 +0000 UTC m=+0.050884783 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 06:52:11 compute-2 podman[238234]: 2025-11-29 06:52:11.901663428 +0000 UTC m=+0.058501536 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:52:11 compute-2 podman[238232]: 2025-11-29 06:52:11.919819464 +0000 UTC m=+0.082505469 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:52:12 compute-2 ceph-mon[77142]: pgmap v1192: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:12.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:13.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:14 compute-2 ceph-mon[77142]: pgmap v1193: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:14.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:15.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:52:15.146 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:52:15.147 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:52:15.147 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:15 compute-2 ceph-mon[77142]: pgmap v1194: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:16 compute-2 sudo[238297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:16 compute-2 sudo[238297]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:16 compute-2 sudo[238297]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:16 compute-2 sudo[238322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:16 compute-2 sudo[238322]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:16 compute-2 sudo[238322]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:16.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:17.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:17 compute-2 ceph-mon[77142]: pgmap v1195: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:18.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:19.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:20 compute-2 ceph-mon[77142]: pgmap v1196: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:20.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:21.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:21 compute-2 ceph-mon[77142]: pgmap v1197: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:22.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:23.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:23 compute-2 ceph-mon[77142]: pgmap v1198: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:24.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:25.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:25 compute-2 ceph-mon[77142]: pgmap v1199: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:26.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:27.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.502552) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147502605, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 2358, "num_deletes": 251, "total_data_size": 6203043, "memory_usage": 6281232, "flush_reason": "Manual Compaction"}
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Nov 29 06:52:27 compute-2 ceph-mon[77142]: pgmap v1200: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147527147, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 4024087, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21119, "largest_seqno": 23472, "table_properties": {"data_size": 4014455, "index_size": 6126, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19000, "raw_average_key_size": 20, "raw_value_size": 3995455, "raw_average_value_size": 4214, "num_data_blocks": 274, "num_entries": 948, "num_filter_entries": 948, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398925, "oldest_key_time": 1764398925, "file_creation_time": 1764399147, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 24643 microseconds, and 11073 cpu microseconds.
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.527195) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 4024087 bytes OK
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.527217) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.530960) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.530992) EVENT_LOG_v1 {"time_micros": 1764399147530985, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.531010) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 6192706, prev total WAL file size 6192706, number of live WAL files 2.
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.532423) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(3929KB)], [42(7197KB)]
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147532479, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 11394557, "oldest_snapshot_seqno": -1}
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5132 keys, 9349035 bytes, temperature: kUnknown
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147605240, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 9349035, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9314272, "index_size": 20829, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12869, "raw_key_size": 128878, "raw_average_key_size": 25, "raw_value_size": 9220817, "raw_average_value_size": 1796, "num_data_blocks": 857, "num_entries": 5132, "num_filter_entries": 5132, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764399147, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.605481) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 9349035 bytes
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.607177) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 156.4 rd, 128.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 7.0 +0.0 blob) out(8.9 +0.0 blob), read-write-amplify(5.2) write-amplify(2.3) OK, records in: 5651, records dropped: 519 output_compression: NoCompression
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.607199) EVENT_LOG_v1 {"time_micros": 1764399147607189, "job": 24, "event": "compaction_finished", "compaction_time_micros": 72833, "compaction_time_cpu_micros": 22100, "output_level": 6, "num_output_files": 1, "total_output_size": 9349035, "num_input_records": 5651, "num_output_records": 5132, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147608165, "job": 24, "event": "table_file_deletion", "file_number": 44}
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147609760, "job": 24, "event": "table_file_deletion", "file_number": 42}
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.532367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.609838) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.609843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.609844) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.609846) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.609848) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:28.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:28 compute-2 sshd-session[238352]: Invalid user administrator from 92.118.39.92 port 57616
Nov 29 06:52:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:29.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:29 compute-2 sshd-session[238352]: Connection closed by invalid user administrator 92.118.39.92 port 57616 [preauth]
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.166898) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149166932, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 262, "num_deletes": 256, "total_data_size": 20532, "memory_usage": 27256, "flush_reason": "Manual Compaction"}
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149168602, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 13142, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23474, "largest_seqno": 23734, "table_properties": {"data_size": 11326, "index_size": 49, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4242, "raw_average_key_size": 16, "raw_value_size": 7889, "raw_average_value_size": 30, "num_data_blocks": 2, "num_entries": 261, "num_filter_entries": 261, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764399149, "oldest_key_time": 1764399149, "file_creation_time": 1764399149, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 1785 microseconds, and 640 cpu microseconds.
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.168682) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 13142 bytes OK
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.168734) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.169783) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.169833) EVENT_LOG_v1 {"time_micros": 1764399149169826, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.169851) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 18466, prev total WAL file size 18466, number of live WAL files 2.
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.170369) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323534' seq:72057594037927935, type:22 .. '6C6F676D00353036' seq:0, type:0; will stop at (end)
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(12KB)], [45(9129KB)]
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149170401, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 9362177, "oldest_snapshot_seqno": -1}
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 4877 keys, 9228475 bytes, temperature: kUnknown
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149233288, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 9228475, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9194913, "index_size": 20268, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 124775, "raw_average_key_size": 25, "raw_value_size": 9105384, "raw_average_value_size": 1867, "num_data_blocks": 828, "num_entries": 4877, "num_filter_entries": 4877, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764399149, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.233519) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 9228475 bytes
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.234944) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 148.7 rd, 146.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 8.9 +0.0 blob) out(8.8 +0.0 blob), read-write-amplify(1414.6) write-amplify(702.2) OK, records in: 5393, records dropped: 516 output_compression: NoCompression
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.234966) EVENT_LOG_v1 {"time_micros": 1764399149234956, "job": 26, "event": "compaction_finished", "compaction_time_micros": 62960, "compaction_time_cpu_micros": 18764, "output_level": 6, "num_output_files": 1, "total_output_size": 9228475, "num_input_records": 5393, "num_output_records": 4877, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149235083, "job": 26, "event": "table_file_deletion", "file_number": 47}
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149236972, "job": 26, "event": "table_file_deletion", "file_number": 45}
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.170310) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.237085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.237091) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.237092) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.237094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:29 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.237096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:30 compute-2 ceph-mon[77142]: pgmap v1201: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:30.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:31.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:31 compute-2 ceph-mon[77142]: pgmap v1202: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:32.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:33.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:34 compute-2 ceph-mon[77142]: pgmap v1203: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:34.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:35.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:35 compute-2 ceph-mon[77142]: pgmap v1204: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:36 compute-2 sudo[238358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:36 compute-2 sudo[238358]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:36 compute-2 sudo[238358]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:36 compute-2 sudo[238383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:52:36 compute-2 sudo[238383]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:36 compute-2 sudo[238383]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:36 compute-2 sudo[238409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:36 compute-2 sudo[238409]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:36 compute-2 sudo[238409]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:36 compute-2 sudo[238434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:52:36 compute-2 sudo[238434]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:36.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:36 compute-2 sudo[238497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:36 compute-2 sudo[238497]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:36 compute-2 sudo[238497]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:36 compute-2 sudo[238537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:36 compute-2 sudo[238537]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:36 compute-2 sudo[238537]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:37.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:37 compute-2 podman[238583]: 2025-11-29 06:52:37.062633199 +0000 UTC m=+0.089104065 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Nov 29 06:52:37 compute-2 podman[238583]: 2025-11-29 06:52:37.17106268 +0000 UTC m=+0.197533536 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Nov 29 06:52:37 compute-2 podman[238733]: 2025-11-29 06:52:37.867660309 +0000 UTC m=+0.052782383 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:52:37 compute-2 podman[238733]: 2025-11-29 06:52:37.878105238 +0000 UTC m=+0.063227282 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 06:52:38 compute-2 podman[238797]: 2025-11-29 06:52:38.063444187 +0000 UTC m=+0.047766199 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, build-date=2023-02-22T09:23:20, architecture=x86_64, release=1793, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.28.2, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, vcs-type=git, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 06:52:38 compute-2 podman[238797]: 2025-11-29 06:52:38.078220213 +0000 UTC m=+0.062542245 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, vendor=Red Hat, Inc., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=keepalived, description=keepalived for Ceph)
Nov 29 06:52:38 compute-2 sudo[238434]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:38 compute-2 ceph-mon[77142]: pgmap v1205: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:38 compute-2 sudo[238831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:38 compute-2 sudo[238831]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:38 compute-2 sudo[238831]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:38 compute-2 sudo[238857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:52:38 compute-2 sudo[238857]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:38 compute-2 sudo[238857]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:38 compute-2 sudo[238882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:38 compute-2 sudo[238882]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:38 compute-2 sudo[238882]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:38.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:38 compute-2 sudo[238907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:52:38 compute-2 sudo[238907]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:39.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:39 compute-2 sudo[238907]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:52:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:52:39 compute-2 ceph-mon[77142]: pgmap v1206: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 06:52:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:52:39 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:52:40 compute-2 nova_compute[231979]: 2025-11-29 06:52:40.531 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:40 compute-2 nova_compute[231979]: 2025-11-29 06:52:40.531 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:40 compute-2 nova_compute[231979]: 2025-11-29 06:52:40.531 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:52:40 compute-2 nova_compute[231979]: 2025-11-29 06:52:40.532 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:52:40 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:52:40 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:52:40 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:52:40 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:52:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:40.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:41.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:42 compute-2 ceph-mon[77142]: pgmap v1207: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:42.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:42 compute-2 podman[238967]: 2025-11-29 06:52:42.914612057 +0000 UTC m=+0.063543182 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:52:42 compute-2 podman[238966]: 2025-11-29 06:52:42.935030833 +0000 UTC m=+0.089322771 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:52:42 compute-2 podman[238965]: 2025-11-29 06:52:42.94054894 +0000 UTC m=+0.094268813 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 29 06:52:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:43.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:44 compute-2 ceph-mon[77142]: pgmap v1208: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:44.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:45.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:46 compute-2 ceph-mon[77142]: pgmap v1209: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:46 compute-2 sudo[239028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:46 compute-2 sudo[239028]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:46 compute-2 sudo[239028]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:46.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:46 compute-2 sudo[239053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:52:46 compute-2 sudo[239053]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:46 compute-2 sudo[239053]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:47.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:52:47 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:52:47 compute-2 ceph-mon[77142]: pgmap v1210: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:48.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:49.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:49 compute-2 ceph-mon[77142]: pgmap v1211: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:50.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:51.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:52 compute-2 ceph-mon[77142]: pgmap v1212: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:52.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:53.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:53 compute-2 ceph-mon[77142]: pgmap v1213: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:54.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:55.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:56 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:56 compute-2 ceph-mon[77142]: pgmap v1214: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:56.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:57 compute-2 sudo[239083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:57 compute-2 sudo[239083]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:57 compute-2 sudo[239083]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:57.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:57 compute-2 sudo[239108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:57 compute-2 sudo[239108]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:57 compute-2 sudo[239108]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:58 compute-2 ceph-mon[77142]: pgmap v1215: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:58.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:52:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:59.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:59 compute-2 ceph-mon[77142]: pgmap v1216: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:00.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:01.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:01 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:02 compute-2 ceph-mon[77142]: pgmap v1217: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:02.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:03.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:03 compute-2 ceph-mon[77142]: pgmap v1218: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:04.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:05.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:06 compute-2 ceph-mon[77142]: pgmap v1219: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:06.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:07.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:08 compute-2 ceph-mon[77142]: pgmap v1220: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:08.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:09.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:10 compute-2 ceph-mon[77142]: pgmap v1221: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:10.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:11.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:11 compute-2 ceph-mon[77142]: pgmap v1222: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:12.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:13.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:13 compute-2 podman[239142]: 2025-11-29 06:53:13.925725466 +0000 UTC m=+0.076093977 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:53:13 compute-2 podman[239143]: 2025-11-29 06:53:13.938349854 +0000 UTC m=+0.078792839 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd)
Nov 29 06:53:13 compute-2 podman[239141]: 2025-11-29 06:53:13.96359792 +0000 UTC m=+0.111626238 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:53:14 compute-2 ceph-mon[77142]: pgmap v1223: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:14.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:15.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:53:15.147 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:53:15.148 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:53:15.148 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:16 compute-2 ceph-mon[77142]: pgmap v1224: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:53:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:16.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:53:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:17.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:17 compute-2 sudo[239203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:17 compute-2 sudo[239203]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:17 compute-2 sudo[239203]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:17 compute-2 sudo[239228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:17 compute-2 sudo[239228]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:17 compute-2 sudo[239228]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:18 compute-2 nova_compute[231979]: 2025-11-29 06:53:18.472 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:53:18 compute-2 nova_compute[231979]: 2025-11-29 06:53:18.473 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-2 nova_compute[231979]: 2025-11-29 06:53:18.473 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-2 nova_compute[231979]: 2025-11-29 06:53:18.473 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-2 nova_compute[231979]: 2025-11-29 06:53:18.474 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-2 nova_compute[231979]: 2025-11-29 06:53:18.474 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-2 nova_compute[231979]: 2025-11-29 06:53:18.474 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-2 nova_compute[231979]: 2025-11-29 06:53:18.474 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:53:18 compute-2 nova_compute[231979]: 2025-11-29 06:53:18.475 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:18.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:18 compute-2 nova_compute[231979]: 2025-11-29 06:53:18.923 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 52.88 sec
Nov 29 06:53:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:53:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:19.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:53:19 compute-2 ceph-mon[77142]: pgmap v1225: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:20.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:21 compute-2 ceph-mon[77142]: pgmap v1226: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:21.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:22 compute-2 ceph-mon[77142]: pgmap v1227: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:22.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:23.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:23 compute-2 nova_compute[231979]: 2025-11-29 06:53:23.181 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:23 compute-2 nova_compute[231979]: 2025-11-29 06:53:23.181 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:23 compute-2 nova_compute[231979]: 2025-11-29 06:53:23.182 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:23 compute-2 nova_compute[231979]: 2025-11-29 06:53:23.182 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:53:23 compute-2 nova_compute[231979]: 2025-11-29 06:53:23.183 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:53:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:53:23 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2185145600' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:23 compute-2 nova_compute[231979]: 2025-11-29 06:53:23.598 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:53:23 compute-2 nova_compute[231979]: 2025-11-29 06:53:23.764 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:53:23 compute-2 nova_compute[231979]: 2025-11-29 06:53:23.766 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5280MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:53:23 compute-2 nova_compute[231979]: 2025-11-29 06:53:23.766 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:23 compute-2 nova_compute[231979]: 2025-11-29 06:53:23.766 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:24 compute-2 ceph-mon[77142]: pgmap v1228: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:24.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:25.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:26 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2185145600' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:26 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/586851354' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:26 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2060863307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:26 compute-2 ceph-mon[77142]: pgmap v1229: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:26.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:27.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:27 compute-2 ceph-mon[77142]: pgmap v1230: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:28.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:29.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:29 compute-2 nova_compute[231979]: 2025-11-29 06:53:29.173 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 0.25 sec
Nov 29 06:53:30 compute-2 ceph-mon[77142]: pgmap v1231: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:30.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:31.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:32 compute-2 ceph-mon[77142]: pgmap v1232: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:32.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:33.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:33 compute-2 ceph-mon[77142]: pgmap v1233: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:53:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:34.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:53:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:35.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:36 compute-2 ceph-mon[77142]: pgmap v1234: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:36.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:53:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:37.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:53:37 compute-2 sudo[239285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:37 compute-2 sudo[239285]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:37 compute-2 sudo[239285]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:37 compute-2 sudo[239310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:37 compute-2 sudo[239310]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:37 compute-2 sudo[239310]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:38 compute-2 ceph-mon[77142]: pgmap v1235: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:38 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1865317670' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:53:38 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1865317670' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:53:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:38.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:39.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:40 compute-2 ceph-mon[77142]: pgmap v1236: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:40 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1260406766' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:53:40 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1260406766' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:53:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:40.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:41.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:41 compute-2 ceph-mon[77142]: pgmap v1237: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:42.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:53:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:43.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:53:44 compute-2 ceph-mon[77142]: pgmap v1238: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:44.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:44 compute-2 podman[239341]: 2025-11-29 06:53:44.894996535 +0000 UTC m=+0.053534513 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:53:44 compute-2 podman[239340]: 2025-11-29 06:53:44.909972896 +0000 UTC m=+0.064546868 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:53:44 compute-2 podman[239339]: 2025-11-29 06:53:44.92360315 +0000 UTC m=+0.089473805 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:53:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:45.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:45 compute-2 ceph-mon[77142]: pgmap v1239: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:46.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:46 compute-2 sudo[239401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:46 compute-2 sudo[239401]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:46 compute-2 sudo[239401]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:46 compute-2 sudo[239426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:53:47 compute-2 sudo[239426]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:47 compute-2 sudo[239426]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:47 compute-2 sudo[239451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:47 compute-2 sudo[239451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:47 compute-2 sudo[239451]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:47 compute-2 sudo[239476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:53:47 compute-2 sudo[239476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:53:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:47.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:53:47 compute-2 ceph-mon[77142]: pgmap v1240: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:47 compute-2 sudo[239476]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:48.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:49 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:53:49 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:53:49 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:53:49 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:53:49 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:53:49 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:53:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:49.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:50 compute-2 ceph-mon[77142]: pgmap v1241: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:50.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:51 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:51.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:52 compute-2 nova_compute[231979]: 2025-11-29 06:53:51.999 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:53:52 compute-2 nova_compute[231979]: 2025-11-29 06:53:51.999 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:53:52 compute-2 nova_compute[231979]: 2025-11-29 06:53:52.045 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:53:52 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:53:52 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3731896802' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:52 compute-2 nova_compute[231979]: 2025-11-29 06:53:52.480 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:53:52 compute-2 nova_compute[231979]: 2025-11-29 06:53:52.487 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:53:52 compute-2 ceph-mon[77142]: pgmap v1242: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:52.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:53.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:54 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/3731896802' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:54 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3803659993' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:54 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/185560197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:54 compute-2 ceph-mon[77142]: pgmap v1243: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:54 compute-2 nova_compute[231979]: 2025-11-29 06:53:54.475 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:53:54 compute-2 nova_compute[231979]: 2025-11-29 06:53:54.479 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:53:54 compute-2 nova_compute[231979]: 2025-11-29 06:53:54.480 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 30.713s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:54 compute-2 nova_compute[231979]: 2025-11-29 06:53:54.481 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:54 compute-2 nova_compute[231979]: 2025-11-29 06:53:54.481 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 06:53:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:54.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:55.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:56 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:56 compute-2 ceph-mon[77142]: pgmap v1244: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:56.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:57.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:57 compute-2 sudo[239560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:57 compute-2 sudo[239560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:57 compute-2 sudo[239560]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:57 compute-2 sudo[239585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:57 compute-2 sudo[239585]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:57 compute-2 sudo[239585]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:58 compute-2 ceph-mon[77142]: pgmap v1245: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:58.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:59 compute-2 sudo[239611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:59 compute-2 sudo[239611]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:59 compute-2 sudo[239611]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:53:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:59.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:59 compute-2 sudo[239636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:53:59 compute-2 sudo[239636]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:59 compute-2 sudo[239636]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:59 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:53:59 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:53:59 compute-2 ceph-mon[77142]: pgmap v1246: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:00.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:01 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:01.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:02 compute-2 ceph-mon[77142]: pgmap v1247: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:02 compute-2 nova_compute[231979]: 2025-11-29 06:54:02.642 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 13.47 sec
Nov 29 06:54:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:02.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:03 compute-2 nova_compute[231979]: 2025-11-29 06:54:03.047 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 06:54:03 compute-2 nova_compute[231979]: 2025-11-29 06:54:03.048 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:03 compute-2 nova_compute[231979]: 2025-11-29 06:54:03.048 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 06:54:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:03.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/972693801' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:54:03 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/972693801' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:54:04 compute-2 ceph-mon[77142]: pgmap v1248: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:04 compute-2 nova_compute[231979]: 2025-11-29 06:54:04.485 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:04.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:05.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:05 compute-2 ceph-mon[77142]: pgmap v1249: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:06 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:06.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:07.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:08 compute-2 ceph-mon[77142]: pgmap v1250: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:08.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:09.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:10 compute-2 ceph-mon[77142]: pgmap v1251: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:10.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:11.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:11 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:12 compute-2 ceph-mon[77142]: pgmap v1252: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:13.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:13 compute-2 ceph-mon[77142]: pgmap v1253: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:14.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:54:15.148 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:54:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:54:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:15.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:15 compute-2 podman[239671]: 2025-11-29 06:54:15.89367084 +0000 UTC m=+0.054742320 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 06:54:15 compute-2 podman[239670]: 2025-11-29 06:54:15.905703433 +0000 UTC m=+0.067002159 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 06:54:15 compute-2 podman[239669]: 2025-11-29 06:54:15.945251753 +0000 UTC m=+0.111924403 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 29 06:54:16 compute-2 ceph-mon[77142]: pgmap v1254: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:16 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:16.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:17.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:17 compute-2 ceph-mon[77142]: pgmap v1255: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:17 compute-2 sudo[239732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:54:17 compute-2 sudo[239732]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:17 compute-2 sudo[239732]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:17 compute-2 sudo[239757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:54:17 compute-2 sudo[239757]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:17 compute-2 sudo[239757]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:18.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:19.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:19 compute-2 ceph-mon[77142]: pgmap v1256: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:19 compute-2 nova_compute[231979]: 2025-11-29 06:54:19.737 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:19 compute-2 nova_compute[231979]: 2025-11-29 06:54:19.737 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:20.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:21.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:21 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:22 compute-2 ceph-mon[77142]: pgmap v1257: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:22.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:23.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:23 compute-2 ceph-mon[77142]: pgmap v1258: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:24.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:25.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:25 compute-2 ceph-mon[77142]: pgmap v1259: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:26.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:27.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:27 compute-2 nova_compute[231979]: 2025-11-29 06:54:27.876 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:27 compute-2 nova_compute[231979]: 2025-11-29 06:54:27.877 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:54:27 compute-2 nova_compute[231979]: 2025-11-29 06:54:27.877 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:54:28 compute-2 ceph-mon[77142]: pgmap v1260: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:28 compute-2 nova_compute[231979]: 2025-11-29 06:54:28.894 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.25 sec
Nov 29 06:54:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:28.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:29.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:29 compute-2 ceph-mon[77142]: pgmap v1261: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:30.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:31.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:31 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:32 compute-2 ceph-mon[77142]: pgmap v1262: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:32 compute-2 nova_compute[231979]: 2025-11-29 06:54:32.725 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:54:32 compute-2 nova_compute[231979]: 2025-11-29 06:54:32.725 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:32 compute-2 nova_compute[231979]: 2025-11-29 06:54:32.725 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:32 compute-2 nova_compute[231979]: 2025-11-29 06:54:32.726 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:32 compute-2 nova_compute[231979]: 2025-11-29 06:54:32.726 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:32 compute-2 nova_compute[231979]: 2025-11-29 06:54:32.726 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:32 compute-2 nova_compute[231979]: 2025-11-29 06:54:32.726 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:32.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:33.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:33 compute-2 ceph-mon[77142]: pgmap v1263: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:34.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:35.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:35 compute-2 ceph-mon[77142]: pgmap v1264: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:54:36 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3604148447' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:54:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:54:36 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3604148447' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:54:36 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/3604148447' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:54:36 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/3604148447' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:54:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:36.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:37.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:37 compute-2 sudo[239792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:54:37 compute-2 sudo[239792]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:38 compute-2 sudo[239792]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:38 compute-2 sudo[239817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:54:38 compute-2 sudo[239817]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:38 compute-2 sudo[239817]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:38 compute-2 ceph-mon[77142]: pgmap v1265: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:38.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:39.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:40 compute-2 ceph-mon[77142]: pgmap v1266: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:40.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:41.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:41 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:41 compute-2 sshd-session[239844]: Connection closed by authenticating user root 92.118.39.92 port 51038 [preauth]
Nov 29 06:54:41 compute-2 ceph-mon[77142]: pgmap v1267: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:42.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:43.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:44 compute-2 ceph-mon[77142]: pgmap v1268: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:44.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:45.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:45 compute-2 ceph-mon[77142]: pgmap v1269: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:46 compute-2 podman[239850]: 2025-11-29 06:54:46.886646976 +0000 UTC m=+0.047875315 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 29 06:54:46 compute-2 podman[239851]: 2025-11-29 06:54:46.895019701 +0000 UTC m=+0.054693818 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 06:54:46 compute-2 podman[239849]: 2025-11-29 06:54:46.917644898 +0000 UTC m=+0.079708889 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 06:54:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:46.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:47.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:47 compute-2 ceph-mon[77142]: pgmap v1270: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:48.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:48 compute-2 nova_compute[231979]: 2025-11-29 06:54:48.954 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:48 compute-2 nova_compute[231979]: 2025-11-29 06:54:48.954 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:54:48 compute-2 nova_compute[231979]: 2025-11-29 06:54:48.954 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:49 compute-2 nova_compute[231979]: 2025-11-29 06:54:49.154 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 0.26 sec
Nov 29 06:54:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:49.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:50 compute-2 ceph-mon[77142]: pgmap v1271: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:50.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:51.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:52.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:53 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:53.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:53 compute-2 ceph-mon[77142]: pgmap v1272: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:54 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:54 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:54 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:54.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:55 compute-2 ceph-mon[77142]: pgmap v1273: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:55.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:56 compute-2 ceph-mon[77142]: pgmap v1274: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:56 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:56 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:56 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:56.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:57.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:57 compute-2 ceph-mon[77142]: pgmap v1275: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:58 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:58 compute-2 sudo[239919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:54:58 compute-2 sudo[239919]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:58 compute-2 sudo[239919]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:58 compute-2 sudo[239944]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:54:58 compute-2 sudo[239944]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:58 compute-2 sudo[239944]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:58 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:58 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:58 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:58.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:54:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:59.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:59 compute-2 sudo[239970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:54:59 compute-2 sudo[239970]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:59 compute-2 sudo[239970]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:59 compute-2 sudo[239995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:54:59 compute-2 sudo[239995]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:59 compute-2 sudo[239995]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:59 compute-2 sudo[240020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:54:59 compute-2 sudo[240020]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:59 compute-2 sudo[240020]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:59 compute-2 sudo[240045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:54:59 compute-2 sudo[240045]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:59 compute-2 sudo[240045]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:00 compute-2 ceph-mon[77142]: pgmap v1276: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:00 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:00 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:00 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:00.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:01.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:01 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:55:01 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:55:01 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:55:01 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:55:01 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:55:01 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:55:02 compute-2 nova_compute[231979]: 2025-11-29 06:55:02.246 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:02 compute-2 nova_compute[231979]: 2025-11-29 06:55:02.247 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:02 compute-2 nova_compute[231979]: 2025-11-29 06:55:02.247 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:02 compute-2 nova_compute[231979]: 2025-11-29 06:55:02.247 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:55:02 compute-2 nova_compute[231979]: 2025-11-29 06:55:02.248 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:55:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2982005399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:02 compute-2 nova_compute[231979]: 2025-11-29 06:55:02.775 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:02 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:02 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:02 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:02.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:02 compute-2 nova_compute[231979]: 2025-11-29 06:55:02.949 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:55:02 compute-2 nova_compute[231979]: 2025-11-29 06:55:02.951 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5261MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:55:02 compute-2 nova_compute[231979]: 2025-11-29 06:55:02.951 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:02 compute-2 nova_compute[231979]: 2025-11-29 06:55:02.951 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:03.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:03 compute-2 ceph-mon[77142]: pgmap v1277: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2982005399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:04 compute-2 ceph-mon[77142]: pgmap v1278: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:04 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:04 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:04 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:04.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:05.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:05 compute-2 ceph-mon[77142]: pgmap v1279: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:06 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:06 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:06 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:06.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:07.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:07 compute-2 ceph-mon[77142]: pgmap v1280: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:08 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:08 compute-2 nova_compute[231979]: 2025-11-29 06:55:08.324 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 9.17 sec
Nov 29 06:55:08 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:08 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:08 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:08.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:09.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:10 compute-2 ceph-mon[77142]: pgmap v1281: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:10 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:10 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:10 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:10.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:11.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:12 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:12 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:12 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:12.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:13 compute-2 ceph-mon[77142]: pgmap v1282: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:13 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:13.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:14 compute-2 ceph-mon[77142]: pgmap v1283: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:14 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:14 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:14 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:14.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:55:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:55:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:55:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:15.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:15 compute-2 nova_compute[231979]: 2025-11-29 06:55:15.359 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:55:15 compute-2 nova_compute[231979]: 2025-11-29 06:55:15.360 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:55:15 compute-2 nova_compute[231979]: 2025-11-29 06:55:15.390 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:15 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:55:15 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3008667807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:15 compute-2 nova_compute[231979]: 2025-11-29 06:55:15.795 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:15 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2860548019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:15 compute-2 ceph-mon[77142]: pgmap v1284: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:15 compute-2 nova_compute[231979]: 2025-11-29 06:55:15.800 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:55:15 compute-2 nova_compute[231979]: 2025-11-29 06:55:15.902 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:55:15 compute-2 nova_compute[231979]: 2025-11-29 06:55:15.904 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:55:15 compute-2 nova_compute[231979]: 2025-11-29 06:55:15.904 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 12.953s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:16 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:16 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:16 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:16.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:17 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/3008667807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:17 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2736452741' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:17 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/595463311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:17.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:17 compute-2 podman[240155]: 2025-11-29 06:55:17.912356942 +0000 UTC m=+0.066633918 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:55:17 compute-2 podman[240156]: 2025-11-29 06:55:17.976686988 +0000 UTC m=+0.119271570 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 29 06:55:17 compute-2 podman[240154]: 2025-11-29 06:55:17.99353429 +0000 UTC m=+0.150698893 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:55:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:18 compute-2 sudo[240217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:55:18 compute-2 sudo[240217]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:55:18 compute-2 sudo[240217]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:18 compute-2 sudo[240242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:55:18 compute-2 sudo[240242]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:55:18 compute-2 sudo[240242]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:18 compute-2 ceph-mon[77142]: pgmap v1285: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:18 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/570431191' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:18 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:18 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:18 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:18.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:19.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:19 compute-2 ceph-mon[77142]: pgmap v1286: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:20 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:20 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:20 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:20.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:21 compute-2 sudo[240269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:55:21 compute-2 sudo[240269]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:55:21 compute-2 sudo[240269]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:21 compute-2 sudo[240294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:55:21 compute-2 sudo[240294]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:55:21 compute-2 sudo[240294]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:21.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:21 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:55:21 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:55:21 compute-2 ceph-mon[77142]: pgmap v1287: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:22 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:22 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:22 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:22.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:23.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:23 compute-2 ceph-mon[77142]: pgmap v1288: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:24 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:24 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:24 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:24.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:25.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:26 compute-2 ceph-mon[77142]: pgmap v1289: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:26 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:26 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:26 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:26.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:27.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:28 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:28 compute-2 ceph-mon[77142]: pgmap v1290: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:28 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:28 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:28 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:28.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:29.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:29 compute-2 ceph-mon[77142]: pgmap v1291: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:30 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:30 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:30 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:30.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:31.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:32 compute-2 ceph-mon[77142]: pgmap v1292: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:32 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:32 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:32 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:32.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:33 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:33.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:33 compute-2 ceph-mon[77142]: pgmap v1293: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:34 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:34 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:34 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:34.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:35.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:35 compute-2 ceph-mon[77142]: pgmap v1294: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:55:36 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/212983180' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:55:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:55:36 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/212983180' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:55:36 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:36 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:36 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:36.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:37 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/212983180' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:55:37 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/212983180' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:55:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:37.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:38 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:38 compute-2 ceph-mon[77142]: pgmap v1295: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:38 compute-2 sudo[240328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:55:38 compute-2 sudo[240328]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:55:38 compute-2 sudo[240328]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:38 compute-2 sudo[240353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:55:38 compute-2 sudo[240353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:55:38 compute-2 sudo[240353]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:38 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:38 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:38 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:38.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:39.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:39 compute-2 ceph-mon[77142]: pgmap v1296: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:40 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:40 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:40 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:40.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:41.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:42 compute-2 ceph-mon[77142]: pgmap v1297: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:42 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:42 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:42 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:42.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:43 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:43.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:43 compute-2 ceph-mon[77142]: pgmap v1298: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:44 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:44 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:44 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:44.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:45.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:46 compute-2 ceph-mon[77142]: pgmap v1299: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:46 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:46 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:46 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:46.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:47.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:47 compute-2 ceph-mon[77142]: pgmap v1300: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:48 compute-2 podman[240384]: 2025-11-29 06:55:48.918527894 +0000 UTC m=+0.060882114 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:55:48 compute-2 podman[240385]: 2025-11-29 06:55:48.927420443 +0000 UTC m=+0.062778085 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 29 06:55:48 compute-2 podman[240383]: 2025-11-29 06:55:48.943222657 +0000 UTC m=+0.086059970 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:55:48 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:48 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:48 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:48.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:49.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:50 compute-2 ceph-mon[77142]: pgmap v1301: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:50 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:50 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:50 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:50.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:51.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:51 compute-2 ceph-mon[77142]: pgmap v1302: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:52 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:52 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:52 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:52.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:53 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:53.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:54 compute-2 ceph-mon[77142]: pgmap v1303: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:54.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:55.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:56 compute-2 ceph-mon[77142]: pgmap v1304: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:57.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:57.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:57 compute-2 ceph-mon[77142]: pgmap v1305: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:58 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:58 compute-2 sudo[240448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:55:58 compute-2 sudo[240448]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:55:58 compute-2 sudo[240448]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:58 compute-2 sudo[240473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:55:58 compute-2 sudo[240473]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:55:58 compute-2 sudo[240473]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:59.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:55:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:59.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:00 compute-2 ceph-mon[77142]: pgmap v1306: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:01.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:01.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:02 compute-2 ceph-mon[77142]: pgmap v1307: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:03.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:03.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:04 compute-2 ceph-mon[77142]: pgmap v1308: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:05.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:05.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:05 compute-2 ceph-mon[77142]: pgmap v1309: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:07.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:07.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:08 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:08 compute-2 ceph-mon[77142]: pgmap v1310: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:09.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:09.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:11 compute-2 ceph-mon[77142]: pgmap v1311: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:11.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:11.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:12 compute-2 ceph-mon[77142]: pgmap v1312: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:13.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:13 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:13.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:14 compute-2 ceph-mon[77142]: pgmap v1313: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:15.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:56:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:56:15.150 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:56:15.150 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:15.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:15 compute-2 ceph-mon[77142]: pgmap v1314: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:15 compute-2 nova_compute[231979]: 2025-11-29 06:56:15.906 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:15 compute-2 nova_compute[231979]: 2025-11-29 06:56:15.907 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:17.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:17.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:17 compute-2 nova_compute[231979]: 2025-11-29 06:56:17.901 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:17 compute-2 nova_compute[231979]: 2025-11-29 06:56:17.901 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:56:17 compute-2 nova_compute[231979]: 2025-11-29 06:56:17.901 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:56:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.291 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.292 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.292 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.292 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.293 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.293 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.293 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.293 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.293 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-2 ceph-mon[77142]: pgmap v1315: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.497 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.498 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.498 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.498 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.499 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:56:18 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/217955619' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:18 compute-2 nova_compute[231979]: 2025-11-29 06:56:18.938 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:19.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:19 compute-2 sudo[240530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:19 compute-2 sudo[240530]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:19 compute-2 sudo[240530]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:19 compute-2 sudo[240573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:19 compute-2 sudo[240573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:19 compute-2 sudo[240573]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:19 compute-2 podman[240556]: 2025-11-29 06:56:19.095836676 +0000 UTC m=+0.056466840 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 06:56:19 compute-2 podman[240554]: 2025-11-29 06:56:19.114859187 +0000 UTC m=+0.082608502 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 06:56:19 compute-2 podman[240555]: 2025-11-29 06:56:19.12089451 +0000 UTC m=+0.085708836 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.147 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.149 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5275MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.149 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.149 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.380444) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379380508, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2336, "num_deletes": 251, "total_data_size": 5965364, "memory_usage": 6051520, "flush_reason": "Manual Compaction"}
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Nov 29 06:56:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:19.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:19 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/217955619' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:19 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3654944032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:19 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1431858114' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379404572, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3916609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23739, "largest_seqno": 26070, "table_properties": {"data_size": 3907165, "index_size": 6002, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18818, "raw_average_key_size": 20, "raw_value_size": 3888452, "raw_average_value_size": 4154, "num_data_blocks": 268, "num_entries": 936, "num_filter_entries": 936, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764399149, "oldest_key_time": 1764399149, "file_creation_time": 1764399379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 24154 microseconds, and 11347 cpu microseconds.
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.404618) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3916609 bytes OK
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.404639) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.405952) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.405967) EVENT_LOG_v1 {"time_micros": 1764399379405962, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.405982) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5955169, prev total WAL file size 5955169, number of live WAL files 2.
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.407096) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3824KB)], [48(9012KB)]
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379407140, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 13145084, "oldest_snapshot_seqno": -1}
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5296 keys, 11154011 bytes, temperature: kUnknown
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379488409, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 11154011, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11116174, "index_size": 23519, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13253, "raw_key_size": 134023, "raw_average_key_size": 25, "raw_value_size": 11017725, "raw_average_value_size": 2080, "num_data_blocks": 968, "num_entries": 5296, "num_filter_entries": 5296, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764399379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.488637) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 11154011 bytes
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.490200) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.6 rd, 137.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 8.8 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(6.2) write-amplify(2.8) OK, records in: 5813, records dropped: 517 output_compression: NoCompression
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.490219) EVENT_LOG_v1 {"time_micros": 1764399379490209, "job": 28, "event": "compaction_finished", "compaction_time_micros": 81329, "compaction_time_cpu_micros": 26075, "output_level": 6, "num_output_files": 1, "total_output_size": 11154011, "num_input_records": 5813, "num_output_records": 5296, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379491033, "job": 28, "event": "table_file_deletion", "file_number": 50}
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379492854, "job": 28, "event": "table_file_deletion", "file_number": 48}
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.407036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.492958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.492965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.492966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.492969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-2 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.492971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.658 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.659 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.677 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing inventories for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.718 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Updating ProviderTree inventory for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.719 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Updating inventory in ProviderTree for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.734 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing aggregate associations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.751 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing trait associations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 06:56:19 compute-2 nova_compute[231979]: 2025-11-29 06:56:19.977 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:20 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:56:20 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2239183731' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:20 compute-2 ceph-mon[77142]: pgmap v1316: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:20 compute-2 nova_compute[231979]: 2025-11-29 06:56:20.426 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:20 compute-2 nova_compute[231979]: 2025-11-29 06:56:20.431 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:56:20 compute-2 nova_compute[231979]: 2025-11-29 06:56:20.529 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:56:20 compute-2 nova_compute[231979]: 2025-11-29 06:56:20.530 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:56:20 compute-2 nova_compute[231979]: 2025-11-29 06:56:20.531 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:21.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:21 compute-2 sudo[240665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:21 compute-2 sudo[240665]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:21 compute-2 sudo[240665]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:21 compute-2 sudo[240690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:56:21 compute-2 sudo[240690]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:21 compute-2 sudo[240690]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:21.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:21 compute-2 sudo[240715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:21 compute-2 sudo[240715]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:21 compute-2 sudo[240715]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:21 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2239183731' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:21 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3108214857' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:21 compute-2 ceph-mon[77142]: pgmap v1317: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:21 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/886911526' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:21 compute-2 sudo[240740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:56:21 compute-2 sudo[240740]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:21 compute-2 sudo[240740]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:56:22 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:56:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:23.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:23.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:56:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:56:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:56:23 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:56:23 compute-2 ceph-mon[77142]: pgmap v1318: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:25.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:25.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:26 compute-2 ceph-mon[77142]: pgmap v1319: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:27.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:27.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:28 compute-2 ceph-mon[77142]: pgmap v1320: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:28 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:29.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:29.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:29 compute-2 ceph-mon[77142]: pgmap v1321: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:31.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:31.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:31 compute-2 sudo[240803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:31 compute-2 sudo[240803]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:31 compute-2 sudo[240803]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:31 compute-2 sudo[240828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:56:31 compute-2 sudo[240828]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:31 compute-2 sudo[240828]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:33.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:33 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:56:33 compute-2 ceph-mon[77142]: pgmap v1322: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:33 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:56:33 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:33.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:35.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:35 compute-2 ceph-mon[77142]: pgmap v1323: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:35.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:36 compute-2 ceph-mon[77142]: pgmap v1324: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:37.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:37.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:38 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:39.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:39 compute-2 sudo[240857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:39 compute-2 sudo[240857]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:39 compute-2 sudo[240857]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:39 compute-2 sudo[240882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:39 compute-2 sudo[240882]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:39 compute-2 sudo[240882]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:39.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:40 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1564014978' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:56:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:41.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:41.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:41 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1564014978' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:56:41 compute-2 ceph-mon[77142]: pgmap v1325: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:41 compute-2 ceph-mon[77142]: pgmap v1326: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:43 compute-2 ceph-mon[77142]: pgmap v1327: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:43.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:43 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:43.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:44 compute-2 ceph-mon[77142]: pgmap v1328: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:45.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:45.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:46 compute-2 ceph-mon[77142]: pgmap v1329: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:47.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:47.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:47 compute-2 ceph-mon[77142]: pgmap v1330: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:49.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:49.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:49 compute-2 podman[240913]: 2025-11-29 06:56:49.924595009 +0000 UTC m=+0.081920924 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 06:56:49 compute-2 podman[240914]: 2025-11-29 06:56:49.935852242 +0000 UTC m=+0.081531874 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 06:56:49 compute-2 podman[240912]: 2025-11-29 06:56:49.95773593 +0000 UTC m=+0.117473650 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:56:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:51.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:51.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:53.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:53 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:53.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:54 compute-2 sshd-session[240976]: Invalid user testuser from 92.118.39.92 port 44468
Nov 29 06:56:54 compute-2 sshd-session[240976]: Connection closed by invalid user testuser 92.118.39.92 port 44468 [preauth]
Nov 29 06:56:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:55.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:55.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:56 compute-2 ceph-mon[77142]: pgmap v1331: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:57.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:57.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:58 compute-2 ceph-mon[77142]: pgmap v1332: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:58 compute-2 ceph-mon[77142]: pgmap v1333: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:58 compute-2 ceph-mon[77142]: pgmap v1334: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:58 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:59.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:59 compute-2 sudo[240981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:59 compute-2 sudo[240981]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:59 compute-2 sudo[240981]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:59 compute-2 sudo[241006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:59 compute-2 sudo[241006]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:59 compute-2 sudo[241006]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:56:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:59 compute-2 ceph-mon[77142]: pgmap v1335: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:01 compute-2 ceph-mon[77142]: pgmap v1336: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:01.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:01.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:02 compute-2 ceph-mon[77142]: pgmap v1337: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:03.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:03.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:04 compute-2 ceph-mon[77142]: pgmap v1338: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:05.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:05.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:05 compute-2 ceph-mon[77142]: pgmap v1339: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:07.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:07.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:07 compute-2 ceph-mon[77142]: pgmap v1340: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:08 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:09.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:09.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:10 compute-2 ceph-mon[77142]: pgmap v1341: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:11.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:11.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:12 compute-2 ceph-mon[77142]: pgmap v1342: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:13.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:13 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:13.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:13 compute-2 ceph-mon[77142]: pgmap v1343: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:15.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:57:15.150 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:57:15.150 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:57:15.151 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:15.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:16 compute-2 ceph-mon[77142]: pgmap v1344: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:17.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:17.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:18 compute-2 ceph-mon[77142]: pgmap v1345: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:19.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:19.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:19 compute-2 sudo[241041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:19 compute-2 sudo[241041]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:19 compute-2 sudo[241041]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:19 compute-2 sudo[241066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:19 compute-2 sudo[241066]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:19 compute-2 sudo[241066]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:19 compute-2 ceph-mon[77142]: pgmap v1346: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:20 compute-2 nova_compute[231979]: 2025-11-29 06:57:20.533 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:20 compute-2 nova_compute[231979]: 2025-11-29 06:57:20.533 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:20 compute-2 podman[241094]: 2025-11-29 06:57:20.904233712 +0000 UTC m=+0.057156978 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 29 06:57:20 compute-2 podman[241092]: 2025-11-29 06:57:20.918743582 +0000 UTC m=+0.078025939 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:57:20 compute-2 podman[241093]: 2025-11-29 06:57:20.927314363 +0000 UTC m=+0.080933088 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 06:57:20 compute-2 nova_compute[231979]: 2025-11-29 06:57:20.930 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:20 compute-2 nova_compute[231979]: 2025-11-29 06:57:20.931 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:57:20 compute-2 nova_compute[231979]: 2025-11-29 06:57:20.931 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:57:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:21.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:21.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:21 compute-2 ceph-mon[77142]: pgmap v1347: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:23.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:23.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:24 compute-2 ceph-mon[77142]: pgmap v1348: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:25.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:25.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:25 compute-2 ceph-mon[77142]: pgmap v1349: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:26 compute-2 nova_compute[231979]: 2025-11-29 06:57:26.762 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:57:26 compute-2 nova_compute[231979]: 2025-11-29 06:57:26.762 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-2 nova_compute[231979]: 2025-11-29 06:57:26.762 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-2 nova_compute[231979]: 2025-11-29 06:57:26.763 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-2 nova_compute[231979]: 2025-11-29 06:57:26.763 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-2 nova_compute[231979]: 2025-11-29 06:57:26.763 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-2 nova_compute[231979]: 2025-11-29 06:57:26.763 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-2 nova_compute[231979]: 2025-11-29 06:57:26.763 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:57:26 compute-2 nova_compute[231979]: 2025-11-29 06:57:26.764 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:27.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:27 compute-2 nova_compute[231979]: 2025-11-29 06:57:27.173 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:27 compute-2 nova_compute[231979]: 2025-11-29 06:57:27.174 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:27 compute-2 nova_compute[231979]: 2025-11-29 06:57:27.174 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:27 compute-2 nova_compute[231979]: 2025-11-29 06:57:27.175 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:57:27 compute-2 nova_compute[231979]: 2025-11-29 06:57:27.175 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:27.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:27 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:57:27 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2389936592' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:27 compute-2 nova_compute[231979]: 2025-11-29 06:57:27.798 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.623s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:27 compute-2 ceph-mon[77142]: pgmap v1350: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:27 compute-2 nova_compute[231979]: 2025-11-29 06:57:27.942 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:57:27 compute-2 nova_compute[231979]: 2025-11-29 06:57:27.943 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5279MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:57:27 compute-2 nova_compute[231979]: 2025-11-29 06:57:27.943 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:27 compute-2 nova_compute[231979]: 2025-11-29 06:57:27.943 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:28 compute-2 nova_compute[231979]: 2025-11-29 06:57:28.312 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:57:28 compute-2 nova_compute[231979]: 2025-11-29 06:57:28.312 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:57:28 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:28 compute-2 nova_compute[231979]: 2025-11-29 06:57:28.397 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:28 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:57:28 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1141100869' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:28 compute-2 nova_compute[231979]: 2025-11-29 06:57:28.826 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:28 compute-2 nova_compute[231979]: 2025-11-29 06:57:28.832 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:57:28 compute-2 nova_compute[231979]: 2025-11-29 06:57:28.942 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:57:28 compute-2 nova_compute[231979]: 2025-11-29 06:57:28.944 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:57:28 compute-2 nova_compute[231979]: 2025-11-29 06:57:28.944 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:29 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2616126904' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1769306037' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2389936592' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3901645275' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/526359832' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1141100869' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:29.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:29.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:30 compute-2 ceph-mon[77142]: pgmap v1351: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:31.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:31.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:31 compute-2 sudo[241203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:31 compute-2 sudo[241203]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:31 compute-2 sudo[241203]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:31 compute-2 sudo[241228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:57:31 compute-2 sudo[241228]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:31 compute-2 sudo[241228]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:32 compute-2 sudo[241253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:32 compute-2 sudo[241253]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:32 compute-2 sudo[241253]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:32 compute-2 sudo[241278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:57:32 compute-2 sudo[241278]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:32 compute-2 ceph-mon[77142]: pgmap v1352: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:32 compute-2 sudo[241278]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:33.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:33 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:33.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:33 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:57:33 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:57:33 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:57:33 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:57:33 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:57:33 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:57:34 compute-2 ceph-mon[77142]: pgmap v1353: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:35.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:35.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:35 compute-2 ceph-mon[77142]: pgmap v1354: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:57:36 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2021761786' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:57:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:57:36 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2021761786' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:57:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:37.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:37.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:38 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/2021761786' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:57:38 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/2021761786' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:57:38 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:39.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:39.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:39 compute-2 sudo[241338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:39 compute-2 sudo[241338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:39 compute-2 sudo[241338]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:39 compute-2 sudo[241363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:39 compute-2 sudo[241363]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:39 compute-2 sudo[241363]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:40 compute-2 ceph-mon[77142]: pgmap v1355: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:41.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:41 compute-2 ceph-mon[77142]: pgmap v1356: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:41.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:43.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:43 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:43.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:45.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:45 compute-2 ceph-mon[77142]: pgmap v1357: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:45.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:47.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:47.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:47 compute-2 ceph-mon[77142]: pgmap v1358: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:47 compute-2 ceph-mon[77142]: pgmap v1359: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:49.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:49 compute-2 ceph-mon[77142]: pgmap v1360: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:49.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:51.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:51.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:51 compute-2 podman[241395]: 2025-11-29 06:57:51.915004273 +0000 UTC m=+0.065799100 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:57:51 compute-2 podman[241396]: 2025-11-29 06:57:51.923130342 +0000 UTC m=+0.072016368 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:57:51 compute-2 podman[241394]: 2025-11-29 06:57:51.943752617 +0000 UTC m=+0.097836013 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 06:57:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:53.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:53 compute-2 ceph-mon[77142]: pgmap v1361: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:53 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:53.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:54 compute-2 ceph-mon[77142]: pgmap v1362: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:54 compute-2 ceph-mon[77142]: pgmap v1363: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:55.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:55.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:56 compute-2 ceph-mon[77142]: pgmap v1364: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:56 compute-2 sudo[241459]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:56 compute-2 sudo[241459]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:56 compute-2 sudo[241459]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:57 compute-2 sudo[241484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:57:57 compute-2 sudo[241484]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:57 compute-2 sudo[241484]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:57.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:57.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:57 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:57:57 compute-2 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:57:57 compute-2 ceph-mon[77142]: pgmap v1365: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:58 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:59.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:57:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:59.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:59 compute-2 sudo[241510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:59 compute-2 sudo[241510]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:59 compute-2 sudo[241510]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:59 compute-2 sudo[241535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:59 compute-2 sudo[241535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:59 compute-2 sudo[241535]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:01.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:01 compute-2 ceph-mon[77142]: pgmap v1366: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:01.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:02 compute-2 ceph-mon[77142]: pgmap v1367: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:03.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:03.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:03 compute-2 ceph-mon[77142]: pgmap v1368: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:05.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:05.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:07.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:07 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:07 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:07 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:07.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:08 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:09.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:09 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:09 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:09 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:09.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:10 compute-2 ceph-mon[77142]: pgmap v1369: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:11.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:11 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:11 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:11 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:11.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:12 compute-2 ceph-mon[77142]: pgmap v1370: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:12 compute-2 ceph-mon[77142]: pgmap v1371: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:12 compute-2 ceph-mon[77142]: pgmap v1372: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:13.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:13 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:13 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:13 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:13 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:13.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:15.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:58:15.152 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:58:15.152 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:15 compute-2 ovn_metadata_agent[143380]: 2025-11-29 06:58:15.152 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:15 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:15 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:15 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:15.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:16 compute-2 ceph-mon[77142]: pgmap v1373: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:16 compute-2 ceph-mon[77142]: pgmap v1374: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:17.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:17 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:17 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:17 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:17.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:18 compute-2 ceph-mon[77142]: pgmap v1375: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:18 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:19.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:19 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:19 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:19 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:19.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:19 compute-2 sudo[241570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:58:19 compute-2 sudo[241570]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:19 compute-2 sudo[241570]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:20 compute-2 sudo[241595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:58:20 compute-2 sudo[241595]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:20 compute-2 sudo[241595]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:20 compute-2 ceph-mon[77142]: pgmap v1376: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:21.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:21 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:21 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:21 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:21.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:22 compute-2 ceph-mon[77142]: pgmap v1377: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:22 compute-2 podman[241624]: 2025-11-29 06:58:22.928647607 +0000 UTC m=+0.078924687 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:58:22 compute-2 podman[241623]: 2025-11-29 06:58:22.958590548 +0000 UTC m=+0.110154043 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 06:58:22 compute-2 podman[241622]: 2025-11-29 06:58:22.95866568 +0000 UTC m=+0.110880913 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:58:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:23.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:23 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:23 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:23 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:23 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:23.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:23 compute-2 nova_compute[231979]: 2025-11-29 06:58:23.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:23 compute-2 nova_compute[231979]: 2025-11-29 06:58:23.862 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:23 compute-2 nova_compute[231979]: 2025-11-29 06:58:23.862 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:58:23 compute-2 nova_compute[231979]: 2025-11-29 06:58:23.862 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:58:23 compute-2 nova_compute[231979]: 2025-11-29 06:58:23.941 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:58:23 compute-2 nova_compute[231979]: 2025-11-29 06:58:23.941 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:23 compute-2 nova_compute[231979]: 2025-11-29 06:58:23.942 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:23 compute-2 nova_compute[231979]: 2025-11-29 06:58:23.942 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:23 compute-2 nova_compute[231979]: 2025-11-29 06:58:23.942 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:58:23 compute-2 nova_compute[231979]: 2025-11-29 06:58:23.942 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:24 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/705562816' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:24 compute-2 ceph-mon[77142]: pgmap v1378: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:24 compute-2 nova_compute[231979]: 2025-11-29 06:58:24.137 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:24 compute-2 nova_compute[231979]: 2025-11-29 06:58:24.137 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:24 compute-2 nova_compute[231979]: 2025-11-29 06:58:24.138 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:24 compute-2 nova_compute[231979]: 2025-11-29 06:58:24.139 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:58:24 compute-2 nova_compute[231979]: 2025-11-29 06:58:24.139 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:24 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:58:24 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2010625727' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:24 compute-2 nova_compute[231979]: 2025-11-29 06:58:24.616 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:24 compute-2 nova_compute[231979]: 2025-11-29 06:58:24.813 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:58:24 compute-2 nova_compute[231979]: 2025-11-29 06:58:24.814 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5283MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:58:24 compute-2 nova_compute[231979]: 2025-11-29 06:58:24.815 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:24 compute-2 nova_compute[231979]: 2025-11-29 06:58:24.815 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:25.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:25 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3131628032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:25 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2010625727' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:25 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:25 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:25 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:25.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:25 compute-2 nova_compute[231979]: 2025-11-29 06:58:25.921 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:58:25 compute-2 nova_compute[231979]: 2025-11-29 06:58:25.922 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:58:25 compute-2 nova_compute[231979]: 2025-11-29 06:58:25.941 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:26 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:58:26 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3947950218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:26 compute-2 nova_compute[231979]: 2025-11-29 06:58:26.396 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:26 compute-2 nova_compute[231979]: 2025-11-29 06:58:26.404 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:58:26 compute-2 nova_compute[231979]: 2025-11-29 06:58:26.481 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:58:26 compute-2 nova_compute[231979]: 2025-11-29 06:58:26.483 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:58:26 compute-2 nova_compute[231979]: 2025-11-29 06:58:26.483 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:26 compute-2 nova_compute[231979]: 2025-11-29 06:58:26.484 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:26 compute-2 ceph-mon[77142]: pgmap v1379: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:26 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2138341124' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:27.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:27 compute-2 nova_compute[231979]: 2025-11-29 06:58:27.502 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:27 compute-2 nova_compute[231979]: 2025-11-29 06:58:27.503 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:27 compute-2 nova_compute[231979]: 2025-11-29 06:58:27.503 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:27 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:27 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:27 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:27.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:27 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/3947950218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:27 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2021454943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:27 compute-2 ceph-mon[77142]: pgmap v1380: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:28 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:29.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:29 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:29 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:29 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:29.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:30 compute-2 ceph-mon[77142]: pgmap v1381: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:31.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:31 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:31 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:31 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:31.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:32 compute-2 ceph-mon[77142]: pgmap v1382: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:32 compute-2 nova_compute[231979]: 2025-11-29 06:58:32.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:32 compute-2 nova_compute[231979]: 2025-11-29 06:58:32.861 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 06:58:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:33.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:33 compute-2 nova_compute[231979]: 2025-11-29 06:58:33.165 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 06:58:33 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:33 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:33 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:33 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:33.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:33 compute-2 ceph-mon[77142]: pgmap v1383: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:35.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:35 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:35 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:35 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:35.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:36 compute-2 sshd-session[241734]: Accepted publickey for zuul from 192.168.122.10 port 45926 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:58:36 compute-2 systemd-logind[784]: New session 52 of user zuul.
Nov 29 06:58:36 compute-2 systemd[1]: Started Session 52 of User zuul.
Nov 29 06:58:36 compute-2 sshd-session[241734]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:58:36 compute-2 sudo[241738]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 29 06:58:36 compute-2 sudo[241738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:58:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:58:36 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1396494908' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:58:36 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:58:36 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1396494908' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:58:37 compute-2 ceph-mon[77142]: pgmap v1384: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:37.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:37 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:37 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:37 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:37.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:38 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1396494908' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:58:38 compute-2 ceph-mon[77142]: from='client.? 192.168.122.10:0/1396494908' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:58:38 compute-2 ceph-mon[77142]: from='client.24755 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:38 compute-2 ceph-mon[77142]: pgmap v1385: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:38 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:39.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:39 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:39 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:39 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:39.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:39 compute-2 ceph-mon[77142]: from='client.24761 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:39 compute-2 ceph-mon[77142]: from='client.14961 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:39 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1573460643' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 29 06:58:39 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1977521267' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 29 06:58:39 compute-2 nova_compute[231979]: 2025-11-29 06:58:39.862 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:39 compute-2 nova_compute[231979]: 2025-11-29 06:58:39.862 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 06:58:39 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Nov 29 06:58:39 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/857796520' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 29 06:58:40 compute-2 sudo[241991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:58:40 compute-2 sudo[241991]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:40 compute-2 sudo[241991]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:40 compute-2 sudo[242016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:58:40 compute-2 sudo[242016]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:40 compute-2 sudo[242016]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:41.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:41 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:41 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:41 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:41.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:42 compute-2 ceph-mon[77142]: from='client.14967 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:42 compute-2 ceph-mon[77142]: from='client.24820 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:42 compute-2 ceph-mon[77142]: pgmap v1386: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:42 compute-2 ceph-mon[77142]: from='client.24826 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:42 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/857796520' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 29 06:58:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:43.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:43 compute-2 ceph-mon[77142]: pgmap v1387: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:43 compute-2 ovs-vsctl[242120]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 29 06:58:43 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:43 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:43 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:43.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:43 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:44 compute-2 virtqemud[231501]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 29 06:58:44 compute-2 virtqemud[231501]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 29 06:58:44 compute-2 virtqemud[231501]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 29 06:58:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:45.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:45 compute-2 ceph-mon[77142]: pgmap v1388: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:45 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:45 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:45 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:45.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:45 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: cache status {prefix=cache status} (starting...)
Nov 29 06:58:45 compute-2 lvm[242444]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:58:45 compute-2 lvm[242444]: VG ceph_vg0 finished
Nov 29 06:58:45 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: client ls {prefix=client ls} (starting...)
Nov 29 06:58:46 compute-2 ceph-mon[77142]: pgmap v1389: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:46 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: damage ls {prefix=damage ls} (starting...)
Nov 29 06:58:46 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump loads {prefix=dump loads} (starting...)
Nov 29 06:58:46 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Nov 29 06:58:46 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2741286316' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:46 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Nov 29 06:58:46 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Nov 29 06:58:46 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Nov 29 06:58:47 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Nov 29 06:58:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:47.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:47 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Nov 29 06:58:47 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1529673451' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:58:47 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Nov 29 06:58:47 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: get subtrees {prefix=get subtrees} (starting...)
Nov 29 06:58:47 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:47 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:47 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:47.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:47 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Nov 29 06:58:47 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4200116878' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 29 06:58:47 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: ops {prefix=ops} (starting...)
Nov 29 06:58:47 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Nov 29 06:58:47 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3706317250' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Nov 29 06:58:48 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/352369704' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Nov 29 06:58:48 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/446556486' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: session ls {prefix=session ls} (starting...)
Nov 29 06:58:48 compute-2 ceph-mon[77142]: from='client.24776 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1276981983' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mon[77142]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mon[77142]: from='client.24835 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mon[77142]: from='client.24841 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/135313544' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2741286316' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mon[77142]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:48 compute-2 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: status {prefix=status} (starting...)
Nov 29 06:58:48 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:49.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Nov 29 06:58:49 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2498518273' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Nov 29 06:58:49 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2393270283' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:58:49 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:49 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:49 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:49.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.24853 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.24803 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.14982 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: pgmap v1390: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1029441042' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1529673451' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3853464868' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.24880 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/4200116878' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1898843982' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/961495441' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/3706317250' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/352369704' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.24836 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2721476524' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/446556486' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.14994 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.24910 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.24848 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.24916 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: pgmap v1391: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2498518273' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/684679155' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3910457812' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2393270283' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.15018 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1704439386' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Nov 29 06:58:49 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/258398947' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:58:49 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Nov 29 06:58:49 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2584480393' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 29 06:58:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Nov 29 06:58:50 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3995147939' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:58:50 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Nov 29 06:58:50 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1377123572' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 29 06:58:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:51.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:51 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:51 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:51 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:51.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000186 1 0.000250
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000009 0 0.000000
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 129 handle_osd_map epochs [127,129], i have 129, src has [1,129]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:49.684203+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 804350 data_alloc: 218103808 data_used: 270336
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 1212416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 129 heartbeat osd_stat(store_statfs(0x1bceaa000/0x0/0x1bfc00000, data 0xc453d/0x183000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2bcf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:50.684426+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 1212416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 scrub-queue::remove_from_osd_queue removing pg[9.1d] failed. State was: not registered w/ OSD
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 129 heartbeat osd_stat(store_statfs(0x1bceaa000/0x0/0x1bfc00000, data 0xc453d/0x183000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2bcf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:51.684624+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 1212416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 129 handle_osd_map epochs [129,130], i have 129, src has [1,130]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:52.684835+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 1204224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:53.685022+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 1204224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 130 handle_osd_map epochs [130,131], i have 130, src has [1,131]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Nov 29 06:58:51 compute-2 ceph-osd[79822]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 0'0 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 5.068034 8 0.000145
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 0'0 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 0'0 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 scrub-queue::remove_from_osd_queue removing pg[9.1d] failed. State was: not registered w/ OSD
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 54'528 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.013292 4 0.000222
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 54'528 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 54'528 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000087 1 0.000116
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 54'528 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.216606 1 0.000085
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:54.685184+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 818484 data_alloc: 218103808 data_used: 270336
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.497914314s of 10.499240875s, submitted: 21
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.604952 1 0.000083
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.835138 0 0.000000
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] exit Started 5.903227 0 0.000000
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] enter Reset
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] exit Reset 0.000147 1 0.000220
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] enter Started
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] enter Start
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] enter Started/Primary
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.005875 2 0.000435
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 132 handle_osd_map epochs [132,132], i have 132, src has [1,132]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: merge_log_dups log.dups.size()=0olog.dups.size()=36
Nov 29 06:58:51 compute-2 ceph-osd[79822]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=36
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000746 2 0.000106
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:55.685331+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 132 handle_osd_map epochs [132,133], i have 132, src has [1,133]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 132 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 132 handle_osd_map epochs [132,133], i have 133, src has [1,133]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.495460 2 0.000095
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.502198 0 0.000000
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=132/93 les/c/f=133/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005318 4 0.000260
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=132/93 les/c/f=133/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=132/93 les/c/f=133/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000030 0 0.000000
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=132/93 les/c/f=133/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:56.685491+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 133 heartbeat osd_stat(store_statfs(0x1bca8e000/0x0/0x1bfc00000, data 0xc992a/0x18e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 133 heartbeat osd_stat(store_statfs(0x1bca8c000/0x0/0x1bfc00000, data 0xcb456/0x191000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:57.685747+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 1179648 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:58.685883+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 1179648 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:24:59.686036+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827726 data_alloc: 218103808 data_used: 270336
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca88000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 1171456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:00.686182+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 1171456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca88000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca88000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:01.686358+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 1171456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Nov 29 06:58:51 compute-2 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:02.686545+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: log_client  log_queue is 2 last_log 165 sent 163 num 2 unsent 2 sending 2
Nov 29 06:58:51 compute-2 ceph-osd[79822]: log_client  will send 2025-11-29T06:25:32.248327+0000 osd.2 (osd.2) 164 : cluster [DBG] 9.1d scrub starts
Nov 29 06:58:51 compute-2 ceph-osd[79822]: log_client  will send 2025-11-29T06:25:32.280016+0000 osd.2 (osd.2) 165 : cluster [DBG] 9.1d scrub ok
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca88000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:03.686765+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca8a000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:04.686931+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827322 data_alloc: 218103808 data_used: 270336
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca8a000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:05.687064+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:06.687215+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca8a000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:07.687367+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:08.687554+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca8a000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68001792 unmapped: 1146880 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:09.687691+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827322 data_alloc: 218103808 data_used: 270336
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 134 handle_osd_map epochs [134,135], i have 134, src has [1,135]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.387766838s of 15.429096222s, submitted: 16
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:10.688072+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 135 heartbeat osd_stat(store_statfs(0x1bca8a000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:11.688213+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 1212416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: log_client handle_log_ack log(last 165) v1
Nov 29 06:58:51 compute-2 ceph-osd[79822]: log_client  logged 2025-11-29T06:25:32.248327+0000 osd.2 (osd.2) 164 : cluster [DBG] 9.1d scrub starts
Nov 29 06:58:51 compute-2 ceph-osd[79822]: log_client  logged 2025-11-29T06:25:32.280016+0000 osd.2 (osd.2) 165 : cluster [DBG] 9.1d scrub ok
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:12.688389+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 1212416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:13.688550+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 1204224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 135 heartbeat osd_stat(store_statfs(0x1bca86000/0x0/0x1bfc00000, data 0xcebbe/0x197000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:14.688691+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831496 data_alloc: 218103808 data_used: 278528
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 1204224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:15.688846+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 1196032 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:16.688993+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 1196032 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:17.689152+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 136 heartbeat osd_stat(store_statfs(0x1bca83000/0x0/0x1bfc00000, data 0xd0851/0x19a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:18.689392+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:19.689572+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834470 data_alloc: 218103808 data_used: 278528
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:20.689725+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 1179648 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.889656067s of 10.965355873s, submitted: 7
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:21.689877+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 1171456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 137 heartbeat osd_stat(store_statfs(0x1bca83000/0x0/0x1bfc00000, data 0xd0851/0x19a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 137 heartbeat osd_stat(store_statfs(0x1bca80000/0x0/0x1bfc00000, data 0xd2379/0x19d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:22.690084+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 1171456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:23.690257+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:24.690487+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837444 data_alloc: 218103808 data_used: 278528
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:25.690687+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 137 heartbeat osd_stat(store_statfs(0x1bca80000/0x0/0x1bfc00000, data 0xd2379/0x19d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:26.690903+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:27.691056+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68001792 unmapped: 1146880 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:28.691187+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68001792 unmapped: 1146880 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:29.691319+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840418 data_alloc: 218103808 data_used: 278528
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68009984 unmapped: 1138688 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:30.691506+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68009984 unmapped: 1138688 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:31.691731+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 1130496 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 138 heartbeat osd_stat(store_statfs(0x1bca7d000/0x0/0x1bfc00000, data 0xd3e9d/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 138 heartbeat osd_stat(store_statfs(0x1bca7d000/0x0/0x1bfc00000, data 0xd3e9d/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:32.691895+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 1130496 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 138 heartbeat osd_stat(store_statfs(0x1bca7d000/0x0/0x1bfc00000, data 0xd3e9d/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 138 heartbeat osd_stat(store_statfs(0x1bca7d000/0x0/0x1bfc00000, data 0xd3e9d/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:33.692074+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 1130496 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.057117462s of 13.097633362s, submitted: 6
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:34.692245+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68026368 unmapped: 1122304 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:35.692416+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68026368 unmapped: 1122304 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:36.692583+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68034560 unmapped: 1114112 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:37.692783+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68034560 unmapped: 1114112 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:38.692990+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 1105920 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:39.693257+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 1105920 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:40.693400+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 1105920 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:41.693560+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 1097728 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:42.693758+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 1097728 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:43.693963+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 1097728 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:44.694184+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 1089536 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:45.694348+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 1089536 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:46.694503+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 1097728 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:47.694684+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 1089536 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:48.694883+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 1089536 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:49.695089+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 1089536 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:50.695299+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68067328 unmapped: 1081344 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:51.695448+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68067328 unmapped: 1081344 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:52.695693+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68075520 unmapped: 1073152 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:53.695914+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68075520 unmapped: 1073152 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:54.696093+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68075520 unmapped: 1073152 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:55.696260+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 1064960 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:56.696411+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 1064960 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:57.696565+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 1064960 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:58.696745+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68091904 unmapped: 1056768 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:59.696905+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68100096 unmapped: 1048576 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:00.697112+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68108288 unmapped: 1040384 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:01.697293+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68108288 unmapped: 1040384 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:02.697728+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 1032192 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:03.697891+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 1032192 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:04.698039+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 1032192 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:05.698229+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 1032192 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:06.698406+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 1024000 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:07.698570+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 1024000 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:08.698672+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 1024000 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:09.699017+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 1015808 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:10.699169+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68141056 unmapped: 1007616 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:11.699321+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68141056 unmapped: 1007616 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:12.699536+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68141056 unmapped: 1007616 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:13.699718+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68149248 unmapped: 999424 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:14.699898+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68149248 unmapped: 999424 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:15.700137+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68157440 unmapped: 991232 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:16.700339+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68157440 unmapped: 991232 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:17.700550+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68165632 unmapped: 983040 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:18.700767+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 974848 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:19.700893+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 974848 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:20.701049+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 966656 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:21.701205+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 966656 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:22.701423+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 966656 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:23.701567+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 958464 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:24.701732+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 958464 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:25.701888+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 958464 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:26.702107+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 950272 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:27.702285+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 950272 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:28.702436+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68206592 unmapped: 942080 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:29.702567+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68206592 unmapped: 942080 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:30.702883+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68206592 unmapped: 942080 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:31.703088+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68214784 unmapped: 933888 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:32.703270+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68214784 unmapped: 933888 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:33.703452+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68214784 unmapped: 933888 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:34.703626+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68222976 unmapped: 925696 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:35.703862+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68222976 unmapped: 925696 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:36.704103+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68231168 unmapped: 917504 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:37.704358+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68231168 unmapped: 917504 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:38.704575+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68231168 unmapped: 917504 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:39.704795+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 909312 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:40.705014+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 909312 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:41.705170+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 909312 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:42.705387+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68247552 unmapped: 901120 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:43.705540+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68247552 unmapped: 901120 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:44.705713+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 892928 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:45.705910+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 892928 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:46.706109+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 892928 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:47.706307+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 892928 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:48.706509+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 884736 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:49.706678+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 884736 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:50.706894+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 876544 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:51.707074+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 876544 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:52.707297+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 876544 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:53.707450+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 868352 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:54.707617+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 868352 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:55.707795+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 860160 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:56.707979+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 860160 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:57.708137+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 851968 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:58.708296+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 851968 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:59.708459+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 851968 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:00.708627+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68304896 unmapped: 843776 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:01.708790+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68304896 unmapped: 843776 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:02.709055+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68304896 unmapped: 843776 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:03.709270+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68313088 unmapped: 835584 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:04.709481+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68313088 unmapped: 835584 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:05.709653+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68321280 unmapped: 827392 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:06.709888+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68321280 unmapped: 827392 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:07.710051+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68321280 unmapped: 827392 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:08.710208+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68329472 unmapped: 819200 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:09.710412+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68329472 unmapped: 819200 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:10.710565+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68329472 unmapped: 819200 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:11.710854+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68337664 unmapped: 811008 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:12.711100+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68337664 unmapped: 811008 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:13.711299+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68337664 unmapped: 811008 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:14.711447+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68345856 unmapped: 802816 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:15.711659+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68345856 unmapped: 802816 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:16.711846+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68354048 unmapped: 794624 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:17.712091+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68354048 unmapped: 794624 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:18.712250+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68354048 unmapped: 794624 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:19.712378+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68362240 unmapped: 786432 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:20.712630+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68362240 unmapped: 786432 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:21.712905+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68362240 unmapped: 786432 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:22.713298+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68370432 unmapped: 778240 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:23.713528+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68370432 unmapped: 778240 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:24.713800+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:25.714063+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:26.714246+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:27.714445+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:28.714673+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68386816 unmapped: 761856 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:29.714914+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68386816 unmapped: 761856 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:30.715231+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68395008 unmapped: 753664 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:31.715396+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68395008 unmapped: 753664 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:32.715609+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 745472 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:33.715793+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:34.716018+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:35.716229+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68395008 unmapped: 753664 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:36.716433+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68395008 unmapped: 753664 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:37.716619+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68395008 unmapped: 753664 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:38.716770+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 745472 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:39.716939+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 745472 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:40.717152+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68411392 unmapped: 737280 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:41.717329+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68411392 unmapped: 737280 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:42.717512+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68411392 unmapped: 737280 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:43.717776+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68419584 unmapped: 729088 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:44.718015+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68419584 unmapped: 729088 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:45.718206+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68419584 unmapped: 729088 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:46.718360+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68427776 unmapped: 720896 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:47.718519+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68427776 unmapped: 720896 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:48.718740+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68427776 unmapped: 720896 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:49.718910+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68435968 unmapped: 712704 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:50.719085+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68435968 unmapped: 712704 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:51.719342+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68435968 unmapped: 712704 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:52.719544+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68444160 unmapped: 704512 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:53.719689+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68444160 unmapped: 704512 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:54.719878+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68452352 unmapped: 696320 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:55.720036+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68452352 unmapped: 696320 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:56.720213+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68460544 unmapped: 688128 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:57.720413+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68460544 unmapped: 688128 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:58.720617+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68460544 unmapped: 688128 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:59.720865+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68476928 unmapped: 671744 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:00.721069+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68476928 unmapped: 671744 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:01.721262+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68476928 unmapped: 671744 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:02.721553+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68485120 unmapped: 663552 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:03.721793+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68493312 unmapped: 655360 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:04.722016+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68501504 unmapped: 647168 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:05.722209+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68501504 unmapped: 647168 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:06.722424+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68501504 unmapped: 647168 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:07.722870+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68509696 unmapped: 638976 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:08.723218+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68509696 unmapped: 638976 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:09.723446+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68509696 unmapped: 638976 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:10.723663+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 630784 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:11.724006+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 630784 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:12.724350+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 622592 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:13.725057+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 622592 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:15.435379+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 622592 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:16.435525+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 614400 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:17.435687+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 614400 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:18.435960+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 606208 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:19.436347+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 606208 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:20.436623+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 606208 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:21.436905+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 598016 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:22.437283+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 598016 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:23.437495+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 589824 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:24.437684+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 589824 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:25.437893+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 589824 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:26.438126+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68567040 unmapped: 581632 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:27.438433+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68567040 unmapped: 581632 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:28.438754+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68567040 unmapped: 581632 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:29.438957+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68583424 unmapped: 565248 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:30.439153+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68583424 unmapped: 565248 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:31.439414+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68591616 unmapped: 557056 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:32.439608+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68591616 unmapped: 557056 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:33.439797+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68599808 unmapped: 548864 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:34.440043+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68599808 unmapped: 548864 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:35.440219+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68599808 unmapped: 548864 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:36.440397+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68599808 unmapped: 548864 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:37.440573+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68608000 unmapped: 540672 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:38.440882+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68608000 unmapped: 540672 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:39.441110+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68608000 unmapped: 540672 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:40.441369+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68616192 unmapped: 532480 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:41.441556+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68616192 unmapped: 532480 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:42.441747+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 524288 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:43.441950+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 524288 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:44.442212+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 524288 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:45.442402+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68632576 unmapped: 516096 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:46.442636+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68632576 unmapped: 516096 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:47.442898+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68640768 unmapped: 507904 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:48.443089+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68640768 unmapped: 507904 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:49.443249+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68648960 unmapped: 499712 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:50.443559+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68657152 unmapped: 491520 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:51.443799+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68657152 unmapped: 491520 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:52.444024+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68665344 unmapped: 483328 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:53.444223+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68665344 unmapped: 483328 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:54.444353+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68657152 unmapped: 491520 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:55.444518+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68665344 unmapped: 483328 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:56.444715+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68665344 unmapped: 483328 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:57.444858+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68665344 unmapped: 483328 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:58.444995+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68673536 unmapped: 475136 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:59.445121+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68673536 unmapped: 475136 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:00.445309+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68673536 unmapped: 475136 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:01.445473+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68681728 unmapped: 466944 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:02.445606+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68681728 unmapped: 466944 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:03.445879+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68689920 unmapped: 458752 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:04.446091+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68689920 unmapped: 458752 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:05.446267+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68689920 unmapped: 458752 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:06.446406+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 450560 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:07.446546+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 450560 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:08.446780+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68706304 unmapped: 442368 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:09.447046+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68706304 unmapped: 442368 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:10.447251+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68714496 unmapped: 434176 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:11.447442+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68714496 unmapped: 434176 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:12.447653+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68714496 unmapped: 434176 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:13.447992+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68722688 unmapped: 425984 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:14.448209+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68722688 unmapped: 425984 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:15.448412+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68722688 unmapped: 425984 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:16.448595+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68730880 unmapped: 417792 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:17.448850+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68730880 unmapped: 417792 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:18.449034+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 5353 writes, 23K keys, 5353 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 5353 writes, 712 syncs, 7.52 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 5353 writes, 23K keys, 5353 commit groups, 1.0 writes per commit group, ingest: 18.68 MB, 0.03 MB/s
                                           Interval WAL: 5353 writes, 712 syncs, 7.52 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68796416 unmapped: 352256 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:19.449160+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68796416 unmapped: 352256 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:20.449349+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68796416 unmapped: 352256 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:21.449593+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68796416 unmapped: 352256 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:22.449771+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68804608 unmapped: 344064 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:23.450018+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68804608 unmapped: 344064 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:24.450191+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68812800 unmapped: 335872 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:25.450356+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68812800 unmapped: 335872 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:26.450532+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68812800 unmapped: 335872 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:27.450668+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68820992 unmapped: 327680 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:28.450823+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68820992 unmapped: 327680 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:29.450984+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68829184 unmapped: 319488 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:30.451208+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68829184 unmapped: 319488 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:31.451407+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68837376 unmapped: 311296 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:32.451582+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68845568 unmapped: 303104 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:33.451822+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68845568 unmapped: 303104 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:34.452033+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68853760 unmapped: 294912 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:35.452165+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68845568 unmapped: 303104 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:36.452334+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68845568 unmapped: 303104 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:37.452505+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68853760 unmapped: 294912 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:38.452649+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68853760 unmapped: 294912 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:39.452876+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68853760 unmapped: 294912 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:40.453035+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68861952 unmapped: 286720 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:41.453175+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68861952 unmapped: 286720 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:42.453325+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68870144 unmapped: 278528 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:43.453648+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68870144 unmapped: 278528 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:44.454070+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68878336 unmapped: 270336 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:45.454524+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68878336 unmapped: 270336 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:46.454886+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68878336 unmapped: 270336 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:47.455049+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68886528 unmapped: 262144 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:48.455325+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68886528 unmapped: 262144 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:49.455516+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68894720 unmapped: 253952 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:50.455674+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68894720 unmapped: 253952 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:51.455797+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68894720 unmapped: 253952 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:52.455948+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 245760 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:53.456108+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 245760 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:54.456283+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 245760 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:55.456453+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 245760 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:56.456643+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 237568 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:57.456797+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 237568 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:58.456994+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 229376 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:59.457128+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 229376 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:00.457265+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 229376 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:01.457476+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68927488 unmapped: 221184 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:02.457681+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68927488 unmapped: 221184 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:03.457976+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68927488 unmapped: 221184 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:04.458127+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68935680 unmapped: 212992 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:05.458357+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68935680 unmapped: 212992 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:06.458538+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68943872 unmapped: 204800 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:07.458690+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68943872 unmapped: 204800 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:08.458920+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68943872 unmapped: 204800 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:09.459100+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68952064 unmapped: 196608 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:10.459257+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68952064 unmapped: 196608 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:11.459400+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68960256 unmapped: 188416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:12.459590+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68960256 unmapped: 188416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:13.459796+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68968448 unmapped: 180224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:14.459952+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68968448 unmapped: 180224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:15.460138+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68968448 unmapped: 180224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:16.460316+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 172032 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:17.460456+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 172032 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:18.460625+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 172032 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:19.460839+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 155648 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:20.461037+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 155648 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:21.461201+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 147456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:22.461356+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 147456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:23.461751+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 147456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:24.461888+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69009408 unmapped: 139264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:25.462034+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69009408 unmapped: 139264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:26.462207+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69009408 unmapped: 139264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:27.462346+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69017600 unmapped: 131072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:28.462512+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69017600 unmapped: 131072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:29.462686+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69017600 unmapped: 131072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:30.462862+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 296.232727051s of 296.242431641s, submitted: 3
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69148672 unmapped: 0 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:31.463122+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69476352 unmapped: 1769472 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:32.463297+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69500928 unmapped: 1744896 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:33.463513+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 1646592 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:34.463704+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69632000 unmapped: 1613824 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:35.463875+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 540672 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:36.464023+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 466944 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:37.464171+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 442368 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:38.464304+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 335872 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:39.464445+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 262144 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:40.464596+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:41.464733+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:42.464879+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:43.465141+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:44.465295+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:45.465513+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:46.465737+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:47.465904+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 237568 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:48.466182+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 237568 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:49.466390+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71016448 unmapped: 229376 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:50.466587+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71016448 unmapped: 229376 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:51.467499+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 221184 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:52.467939+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 221184 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:53.468158+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 221184 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:54.468585+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 212992 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:55.469092+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 212992 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:56.469718+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 204800 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:57.469907+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 204800 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:58.470053+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71049216 unmapped: 196608 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:59.470196+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 180224 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:00.470358+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71073792 unmapped: 172032 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:01.470683+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 163840 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:02.470887+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 163840 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:03.471281+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 147456 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:04.471513+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 147456 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:05.471649+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 139264 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:06.471912+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 139264 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:07.472137+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 139264 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:08.472427+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 131072 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:09.472650+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 131072 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:10.472792+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 131072 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:11.472949+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 106496 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:12.473107+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 106496 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:13.473289+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 98304 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:14.473471+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 98304 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:15.473641+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 90112 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:16.473796+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 90112 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:17.473957+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 90112 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:18.474119+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 73728 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:19.475366+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 73728 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:20.475995+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 73728 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:21.476199+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 73728 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:22.476413+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 73728 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:23.476601+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 65536 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:24.477013+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 65536 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:25.477204+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:26.477571+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 65536 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:27.477865+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 65536 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:28.478121+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 65536 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:29.478333+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:30.478732+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:31.478908+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:32.479067+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:33.479344+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:34.479558+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:35.479784+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:36.480612+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:37.480800+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:38.480974+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:39.482130+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:40.482275+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:41.482437+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:42.483136+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:43.483383+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:44.483676+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:45.483934+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:46.484246+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 40960 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:47.484557+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 40960 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:48.484868+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 40960 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:49.485122+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 24576 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:50.485338+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 16384 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:51.485509+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:52.485657+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:53.485780+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:54.485906+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:55.486019+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:56.486184+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:57.486337+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:58.486518+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:59.486627+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:00.486780+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:01.486950+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:02.487116+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:03.487306+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:04.487482+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:05.487602+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:06.487729+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:07.487900+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:08.488062+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:09.488242+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:10.488432+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:11.488607+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:12.488908+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:13.489177+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:14.489364+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:15.489559+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:16.489685+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:17.490076+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:18.490234+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:19.490402+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:20.490852+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:21.491032+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:22.491172+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:23.491407+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:24.491614+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:25.492974+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:26.493394+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:27.493785+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:28.494031+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:29.494549+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:30.495355+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:31.495895+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:32.496025+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:33.496443+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:34.496586+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 1024000 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:35.496794+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 1024000 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:36.497041+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 1024000 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:37.497315+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 1024000 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:38.497649+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 1024000 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:39.497911+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:40.498104+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:41.498280+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:42.498547+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:43.498801+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:44.499199+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:45.499470+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:46.499592+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:47.499823+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:48.500097+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:49.500700+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:50.501250+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:51.501585+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:52.501900+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:53.502141+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:54.502409+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:55.502625+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:56.502877+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:57.503108+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:58.503977+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:59.504158+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:00.505006+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:01.505140+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:02.505900+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:03.506666+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:04.507304+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:05.507904+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:06.508414+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:07.508654+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:08.508857+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 983040 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:09.509212+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 983040 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:10.509597+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 983040 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:11.509915+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 983040 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:12.510080+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:13.510247+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:14.510419+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:15.510612+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:16.510913+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:17.511477+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:18.511978+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:19.512266+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 966656 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:20.512444+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:21.512648+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:22.512775+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:23.512930+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:24.513057+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:25.513189+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:26.513311+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:27.513434+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:28.513571+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:29.513717+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:30.513880+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:31.513996+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:32.514114+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:33.515283+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:34.515748+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:35.515878+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:36.515992+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:37.516142+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:38.516302+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:39.516487+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:40.516651+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:41.533916+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:42.534174+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:43.534409+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:44.534603+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:45.534794+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:46.534992+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:47.535117+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:48.535257+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:49.535405+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:50.535571+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:51.535793+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:52.536027+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:53.536418+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:54.536611+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:55.536792+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 933888 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:56.536978+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 933888 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:57.537115+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 933888 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:58.537297+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 933888 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:59.537572+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 909312 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:00.537751+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 909312 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:01.537942+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 909312 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:02.538201+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 909312 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:03.538537+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 901120 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:04.538777+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 901120 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:05.539053+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 901120 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:06.539307+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 901120 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:07.539502+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 901120 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:08.539711+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 892928 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:09.539894+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 892928 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:10.540145+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 892928 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:11.540346+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 892928 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:12.540543+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 892928 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:13.540850+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:14.541066+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:15.541315+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:16.541601+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:17.541845+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:18.542155+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:19.542340+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:20.542539+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:21.542748+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:22.542926+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:23.543107+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:24.543247+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:25.543377+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:26.543614+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:27.543881+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:28.544052+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:29.544258+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:30.544458+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:31.544652+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:32.544843+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:33.545058+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:34.545272+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:35.545522+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:36.545722+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:37.545987+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:38.546258+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:39.546508+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:40.546851+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:41.547170+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:42.547533+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:43.547961+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:44.548302+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:45.548613+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:46.548909+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:47.549052+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:48.549269+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:49.549435+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:50.549645+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:51.549852+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:52.550064+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:53.550316+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:54.550590+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:55.550853+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:56.551119+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:57.551263+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:58.551417+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:59.551650+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 819200 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:00.551790+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:01.552048+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:02.552279+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:03.552533+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:04.552724+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:05.552870+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:06.553015+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:07.553160+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:08.553313+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:09.553479+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:10.553637+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:11.553778+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:12.553928+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:13.554101+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:14.554204+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:15.554367+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:16.554559+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:17.554719+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:18.554890+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:19.555073+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:20.555211+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:21.555317+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:22.555458+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:23.555674+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:24.555894+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:25.556161+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:26.556386+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:27.556621+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:28.556870+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:29.557063+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 770048 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:30.557244+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 770048 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:31.557397+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 770048 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:32.557611+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 770048 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:33.557791+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 770048 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:34.557944+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:35.558097+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:36.558230+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:37.558622+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:38.558882+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:39.559060+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:40.559244+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:41.559433+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:42.559627+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:43.559893+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:44.560127+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:45.560384+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:46.560544+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:47.560664+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:48.560791+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:49.560985+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 745472 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:50.561142+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 745472 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:51.561329+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 745472 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:52.561492+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 745472 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:53.561688+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 745472 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:54.561860+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 737280 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:55.562020+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 737280 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:56.562235+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 737280 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:57.562389+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 729088 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:58.562542+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 729088 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:59.562691+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:00.562883+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:01.563018+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:02.563158+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:03.563307+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:04.563445+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:05.563653+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:06.563858+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:07.564349+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:08.564470+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:09.564608+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:10.564783+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:11.564960+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:12.565121+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:13.565343+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:14.565490+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:15.565609+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:16.565767+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:17.565894+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:18.566072+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:19.566242+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:20.566374+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:21.566506+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:22.566751+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:23.567013+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:24.567229+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:25.567385+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:26.567578+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:27.567734+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:28.567878+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:29.568004+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:30.568133+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:31.568277+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:32.568381+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:33.568546+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:34.568717+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:35.568848+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:36.568960+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:37.569087+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:38.569208+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:39.569379+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:40.569594+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:41.569722+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:42.569855+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:43.570002+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:44.570157+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:45.570301+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:46.570444+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:47.570588+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:48.570715+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:49.570877+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:50.570991+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:51.571130+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:52.571246+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:53.571387+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:54.571533+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:55.571649+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 638976 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:56.571857+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 638976 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:57.572010+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 638976 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:58.572188+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 638976 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:59.572340+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 614400 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:00.572551+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 614400 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:01.572668+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 614400 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:02.572827+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 614400 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:03.573030+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 614400 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:04.573202+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:05.573343+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:06.573513+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:07.573643+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:08.573786+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:09.574011+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:10.574143+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:11.574309+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:12.574453+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:13.574650+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:14.574792+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:15.574935+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:16.575070+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:17.575231+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:18.575355+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:19.575515+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:20.575702+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:21.575906+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:22.576051+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:23.576171+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:24.576349+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:25.576548+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:26.576715+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:27.576886+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:28.577091+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:29.577238+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:30.577433+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:31.577680+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:32.577845+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:33.578099+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:34.578266+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:35.578409+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:36.578562+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:37.578777+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:38.578948+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:39.579086+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:40.579261+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:41.579399+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:42.579593+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:43.579856+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:44.579987+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:45.580155+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:46.580303+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:47.580430+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:48.580611+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:49.580763+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:50.580960+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:51.581084+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:52.581217+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:53.581347+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:54.581492+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:55.581614+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:56.581923+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:57.582067+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:58.582204+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:59.582332+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:00.582537+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:01.582674+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:02.582798+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:03.582974+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:04.583093+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:05.583282+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:06.583419+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:07.583560+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:08.583705+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:09.583874+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:10.584021+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:11.584188+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:12.584322+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:13.584492+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:14.584656+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:15.584821+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:16.584970+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:17.585091+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:18.585363+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:19.585613+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:20.585749+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:21.586436+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:22.587015+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:23.587183+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:24.587364+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:25.587618+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:26.587840+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:27.588075+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:28.588204+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 532480 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:29.588388+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 532480 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:30.588552+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 532480 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:31.588709+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 532480 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:32.588865+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 532480 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:33.589018+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:34.589240+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:35.589369+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:36.589543+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:37.589673+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:38.589888+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:39.590150+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:40.590314+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:41.590516+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:42.590642+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:43.590869+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:44.591070+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:45.591206+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:46.591338+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:47.591451+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:48.591752+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:49.591880+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:50.592048+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:51.592269+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:52.592549+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:53.592764+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:54.592940+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:55.593123+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:56.593320+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:57.593502+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:58.593628+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:59.593754+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 491520 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:00.593911+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:01.594068+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:02.594253+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:03.594451+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:04.594630+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:05.594751+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:06.594871+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:07.595029+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:08.595224+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:09.595507+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:10.595661+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:11.595853+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:12.596030+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:13.596260+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:14.596396+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:15.596530+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:16.596699+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:17.596876+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 5796 writes, 24K keys, 5796 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5796 writes, 923 syncs, 6.28 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 443 writes, 694 keys, 443 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s
                                           Interval WAL: 443 writes, 211 syncs, 2.10 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc774b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:18.597013+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:19.597155+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:20.597358+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:21.597549+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:22.597710+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:23.597883+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:24.598081+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:25.598259+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:26.598421+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:27.598565+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:28.598742+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:29.598895+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:30.599043+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:31.599168+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:32.599305+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:33.599507+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:34.599656+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:35.599791+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:36.600092+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:37.600317+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:38.600471+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:39.600644+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:40.600892+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:41.601030+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:42.601226+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:43.601466+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:44.601676+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:45.601897+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:46.602059+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:47.602225+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:48.602368+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:49.602526+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:50.602696+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:51.602883+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:52.603033+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:53.603221+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:54.603354+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:55.603486+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:56.603631+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:57.603865+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:58.604015+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71860224 unmapped: 434176 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:59.604149+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:00.604282+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:01.604397+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:02.604557+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:03.604703+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:04.604837+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:05.604969+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:06.605270+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:07.605407+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:08.605522+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:09.605662+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:10.605905+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:11.606043+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:12.606251+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:13.606460+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:14.606596+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:15.606769+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:16.607066+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:17.751196+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:18.751360+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:19.751475+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:20.751607+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:21.751748+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:22.751886+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:23.752048+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:24.752183+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:25.752308+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:26.811926+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:27.812655+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:28.812961+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 401408 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:29.813098+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 401408 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:30.813946+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 599.242004395s of 600.520812988s, submitted: 232
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 262144 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:31.814394+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1081344 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:32.814605+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:33.814794+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:34.815018+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:35.815437+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:36.815572+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:37.815894+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:38.816076+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:39.816289+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:40.816419+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:41.816715+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:42.816882+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:43.817053+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:44.817230+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:45.817390+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:46.817535+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:47.817698+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:48.817842+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:49.817954+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:50.818132+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 917504 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:51.818312+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 917504 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:52.818459+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 917504 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:53.818610+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 917504 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:54.818752+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 917504 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:55.818880+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:56.820201+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:57.820344+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:58.820478+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:59.820656+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:00.820844+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:01.821309+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:02.821462+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:03.822139+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:04.822629+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:05.822920+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:06.823302+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:07.823424+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:08.823546+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:09.823721+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:10.823839+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:11.823959+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:12.824892+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:13.825046+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:14.825250+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:15.825797+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:16.825938+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:17.826138+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:18.826312+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:19.826464+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:20.826571+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:21.826873+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:22.826999+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:23.827177+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:24.827314+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:25.827613+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:26.827775+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:27.827915+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:28.828034+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:29.828188+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:30.828401+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:31.828780+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:32.828942+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:33.829098+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:34.829291+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:35.829547+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:36.829723+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:37.829917+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:38.830035+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:39.830228+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:40.830496+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:41.830751+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:42.830962+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:43.831182+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:44.831383+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:45.831491+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:46.831725+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:47.831955+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:48.832091+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:49.832224+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:50.832367+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:51.832499+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:52.832684+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:53.832852+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:54.832976+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:55.833187+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:56.833326+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:57.833471+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:58.833640+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:59.833780+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:00.833997+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:01.834156+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:02.834381+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:03.834545+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:04.834739+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:05.834890+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:06.835017+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:07.835292+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:08.835435+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:09.835582+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:10.835686+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:11.835844+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:12.835966+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:13.836095+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:14.841363+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:15.841483+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:16.841606+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:17.841761+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:18.841945+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:19.842099+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:20.842222+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:21.842475+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:22.842656+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:23.842866+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:24.842996+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:25.843197+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:26.843349+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:27.843482+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:28.843678+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:29.843871+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:30.844080+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:31.844217+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:32.844323+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:33.844471+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:34.844635+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:35.844776+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:36.844871+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:37.845304+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:38.845661+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:39.846783+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:40.847491+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:41.847940+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:42.848078+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:43.848359+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:44.849076+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:45.849596+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:46.849800+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:47.850068+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:48.850290+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:49.850518+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:50.850661+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:51.850852+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:52.850983+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:53.851138+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:54.851275+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:55.851617+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:56.851875+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:57.851996+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:58.852204+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:59.852417+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:00.852578+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:01.852717+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:02.852892+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:03.853076+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:04.853243+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:05.853384+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:06.853571+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:07.853713+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:08.853919+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:09.854043+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:10.854217+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:11.854375+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:12.854537+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:13.854711+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:14.854860+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:15.855067+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:16.855385+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:17.855548+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:18.855691+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:19.855888+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:20.856041+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:21.856416+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:22.856714+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:23.856873+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:24.857049+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:25.857183+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:26.857502+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:27.857798+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:28.857950+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:29.858233+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:30.858422+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:31.858945+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:32.859223+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:33.859473+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:34.860081+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:35.860462+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:36.860867+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:37.861251+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:38.861724+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:39.862063+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:40.862320+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:41.862695+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:42.862968+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:43.863215+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:44.863446+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:45.863665+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:46.863852+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:47.864051+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:48.864199+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:49.864388+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:50.864586+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:51.864762+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:52.864963+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:53.865134+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:54.865370+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:55.865563+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:56.865703+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:57.865887+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:58.866041+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:59.866171+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:00.866348+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:01.866495+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:02.866788+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:03.867660+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:04.867879+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:05.868053+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:06.868325+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:07.868526+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:08.868745+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:09.868996+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:10.869186+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:11.869426+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:12.869651+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:13.869844+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:14.869970+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:15.870152+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:16.870326+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:17.870452+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:18.870608+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:19.870736+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:20.870927+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:21.871120+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:22.871251+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:23.871409+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:24.871560+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:25.871709+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:26.871893+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:27.872078+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:28.872291+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:29.872485+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:30.872614+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:31.872851+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:32.872993+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:33.873172+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:34.873432+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:35.873598+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:36.873842+0000)
Nov 29 06:58:51 compute-2 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:37.874058+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:38.874232+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:39.874391+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:40.874554+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:41.874701+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:42.874899+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:43.875087+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:44.875228+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:45.875419+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:46.875586+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:47.875769+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:48.875987+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:49.876154+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:50.876306+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:51.876503+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:52.876659+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:53.876929+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:54.877124+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:55.877279+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:56.877467+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:57.877592+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:58.877722+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:59.877901+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:00.878083+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:01.878237+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:02.878409+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:03.878611+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:04.878861+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:05.879039+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:06.879222+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:07.879385+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:08.879559+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:09.879869+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:10.880098+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:11.880293+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:12.880445+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:13.880629+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:14.880862+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:15.881135+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:16.881403+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:17.881636+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:18.881865+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:19.882156+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:20.882339+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:21.882548+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:22.882781+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:23.883163+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:24.883431+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:25.883647+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:26.883883+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:27.884088+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:28.884291+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:29.884434+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:30.884623+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:31.884847+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:32.885043+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:33.885285+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:34.885508+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:35.885774+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:36.885980+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:37.886196+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:38.886422+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:39.886600+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:40.886861+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:41.887084+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:42.887225+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:43.887401+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:44.887624+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:45.887859+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:46.888091+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:47.888281+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:48.888426+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:49.888606+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:50.888868+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:51.889017+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:52.889156+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:53.889315+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:54.889476+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:55.889634+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72605696 unmapped: 737280 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:56.889791+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72605696 unmapped: 737280 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:57.889959+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72605696 unmapped: 737280 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:58.890113+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72605696 unmapped: 737280 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:59.890271+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:00.890630+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:01.890799+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:02.890975+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:03.891158+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:04.891329+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:05.891487+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:06.891684+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:07.891852+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:08.892055+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:09.892227+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:10.892449+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:11.892645+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:12.892769+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:13.892963+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:14.893131+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:15.893277+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:16.893443+0000)
Nov 29 06:58:51 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:17.893631+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:18.893787+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:19.894026+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72654848 unmapped: 688128 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:20.894178+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72654848 unmapped: 688128 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:21.894332+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:22.896352+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:23.897772+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:24.898098+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:25.900124+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:26.901912+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:27.903295+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:28.903497+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:29.904080+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:30.904249+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:31.904376+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:32.904779+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:33.905029+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:34.905204+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:35.905335+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:36.905588+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:37.905777+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:38.906053+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:39.906193+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:40.906428+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:41.906574+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:42.906720+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:43.906907+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:44.907079+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:45.907229+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:46.907443+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:47.907642+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:48.907898+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:49.908050+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:50.908267+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:51.908484+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:52.908713+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:53.908887+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:54.909019+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:55.909139+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:56.909291+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:57.909404+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:58.909540+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:59.909674+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:00.909901+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:01.910053+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:02.910181+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:03.910320+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:04.910483+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:05.910603+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:06.910763+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:07.910926+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:08.911056+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:09.911238+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:10.911360+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:11.911516+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:12.911653+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:13.911836+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:14.911981+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:15.912136+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:16.912267+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:17.912391+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:18.912568+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 614400 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:19.912707+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 614400 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:20.912924+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 614400 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:21.913073+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 614400 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:22.913255+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 614400 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:23.913476+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:24.913670+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:25.913830+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:26.914245+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:27.914577+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:28.914849+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:29.915063+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:30.915374+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:31.915625+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:32.915871+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:33.916068+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:34.916240+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:35.916472+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:36.917090+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:37.917265+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:38.917416+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:39.917542+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:40.917702+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:41.917872+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:42.917951+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:43.918171+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:44.918409+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:45.918551+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:46.918779+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:47.918902+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:48.919155+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:49.919296+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:50.919582+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:51.919720+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:52.919851+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:53.919986+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:54.920143+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:55.920331+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:56.920509+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:57.920718+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:58.920857+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:59.921054+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:00.921251+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:01.921447+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:02.921669+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:03.921884+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:04.922035+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:05.922174+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:06.922382+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:07.922548+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:08.922727+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:09.922899+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:10.923138+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:11.923282+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:12.923419+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:13.923568+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:14.923742+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:15.923896+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:16.924062+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:17.924256+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:18.924441+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 524288 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:19.924603+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 524288 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:20.924762+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 524288 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:21.924915+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 524288 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:22.925156+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 524288 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:23.925370+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:24.925601+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:25.925779+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:26.925935+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:27.926095+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:28.926246+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:29.926410+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:30.926732+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:31.926906+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:32.927057+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:33.927223+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:34.927390+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:35.927550+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:36.927717+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:37.927898+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:38.928088+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:39.928255+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:40.928422+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:41.928597+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:42.928778+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:43.929005+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:44.929231+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:45.929444+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:46.929644+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:47.929846+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:48.930036+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:49.930304+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:50.930515+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:51.930701+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:52.930887+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:53.931146+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:54.931340+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:55.931645+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:56.931870+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:57.932062+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:58.932250+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:59.932421+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:00.932601+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:01.932796+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:02.933040+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:03.933305+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:04.933491+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:05.933688+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:06.933852+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:07.934082+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:08.934285+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:09.934393+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:10.934534+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:11.934673+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:12.934875+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:13.935098+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:14.935402+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:15.935549+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:16.935750+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:17.935906+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 6280 writes, 25K keys, 6280 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 6280 writes, 1161 syncs, 5.41 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 484 writes, 738 keys, 484 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s
                                           Interval WAL: 484 writes, 238 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:18.936072+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:19.936233+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:20.936420+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:21.936566+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:22.936733+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:23.936915+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:24.937088+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:25.937242+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:26.937388+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:27.937551+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:28.937767+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:29.937929+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:30.938113+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:31.938275+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:32.938464+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:33.938656+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:34.939014+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:35.939145+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:36.939299+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:37.939427+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:38.939587+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:39.939723+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:40.939858+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:41.939959+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:42.940191+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:43.940354+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:44.940693+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:45.940851+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:46.940999+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:47.941074+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: mgrc ms_handle_reset ms_handle_reset con 0x55fed1842000
Nov 29 06:58:52 compute-2 ceph-osd[79822]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1221624088
Nov 29 06:58:52 compute-2 ceph-osd[79822]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1221624088,v1:192.168.122.100:6801/1221624088]
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: get_auth_request con 0x55fed3114000 auth_method 0
Nov 29 06:58:52 compute-2 ceph-osd[79822]: mgrc handle_mgr_configure stats_period=5
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:48.941204+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:49.941311+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:50.941459+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:51.941600+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:52.941756+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:53.941905+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:54.942041+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:55.942183+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:56.942317+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:57.942464+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:58.942604+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:59.942744+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:00.942925+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:01.943403+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:02.943547+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:03.943732+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:04.943897+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:05.944028+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:06.944186+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:07.944399+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:08.944532+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:09.944673+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:10.944878+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:11.945069+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:12.945238+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:13.945453+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:14.945605+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:15.945770+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:16.946030+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:17.946192+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:18.946350+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:19.946553+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:20.946726+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:21.946872+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:22.947076+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:23.947269+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:24.947432+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:25.947600+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:26.947794+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:27.947981+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:28.948085+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:29.948224+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:30.948425+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 599.588073730s of 600.260498047s, submitted: 246
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:31.948624+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72933376 unmapped: 409600 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [0,0,0,0,0,0,3])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:32.948851+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72957952 unmapped: 385024 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:33.949095+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72974336 unmapped: 368640 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [1,0,1,1])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:34.949262+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74162176 unmapped: 229376 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:35.949426+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:36.949646+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:37.949772+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:38.949970+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:39.950104+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:40.950290+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:41.950430+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:42.950615+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:43.950768+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:44.951511+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:45.951650+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:46.951977+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:47.952109+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:48.952384+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:49.952528+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:50.952656+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:51.952837+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:52.953208+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:53.953385+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:54.953673+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:55.953882+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:56.954390+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:57.954549+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:58.954703+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:59.954867+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:00.955056+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:01.955205+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:02.955349+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:03.955534+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:04.955672+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:05.955832+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:06.955952+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:07.956085+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:08.956204+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:09.956365+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:10.956508+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:11.956680+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:12.956831+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:13.957033+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:14.957258+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:15.957382+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:16.957570+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:17.957720+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:18.957937+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:19.958100+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:20.958325+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:21.958468+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:22.958704+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:23.958918+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:24.959106+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:25.959241+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:26.959422+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:27.959560+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:28.965631+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:29.965765+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:30.965926+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:31.966074+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:32.966224+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:33.966394+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:34.966545+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:35.966691+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:36.966855+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:37.966989+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:38.967164+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:39.967316+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:40.967472+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:41.967605+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:42.968037+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:43.968231+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:44.968391+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:45.968548+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:46.968713+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:47.968877+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:48.969069+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:49.969262+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:50.969415+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:51.969553+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:52.969739+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:53.970010+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:54.970155+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:55.970292+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:56.970521+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:57.970674+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:58.970838+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:59.971011+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:00.971148+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:01.971284+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:02.971430+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:03.971577+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:04.971713+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:05.971864+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:06.971974+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:07.972089+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:08.972240+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:09.972389+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:10.972586+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:11.972738+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:12.972901+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:13.973122+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:14.973284+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:15.973428+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:16.973679+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:17.973834+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:18.973958+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:19.974103+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:20.974284+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:21.974452+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:22.974622+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:23.974825+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:24.975020+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:25.975156+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:26.975305+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:27.975439+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:28.975622+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:29.975847+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:30.976021+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:31.976191+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:32.976392+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:33.976563+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:34.976688+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:35.976858+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:36.977053+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:37.977233+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:38.977442+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:39.977666+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:40.977841+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:41.977988+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:42.978128+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:43.978296+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:44.978454+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:45.978625+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:46.978777+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:47.978941+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:48.979132+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:49.979278+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:50.979462+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:51.979638+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:52.979779+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:53.979963+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:54.980116+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:55.980299+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:56.980508+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:57.980669+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:58.980899+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:59.981110+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:00.981299+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:01.981494+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:02.981715+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:03.981879+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:04.982128+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:05.982296+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:06.982454+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:07.982577+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:08.982698+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:09.982854+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:10.982988+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:11.983168+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:12.983375+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:13.983666+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:14.983863+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:15.983989+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:16.984107+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:17.984274+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:18.984446+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:19.984615+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:20.984749+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:21.984923+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:22.985058+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:23.985227+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:24.985408+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:25.985537+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:26.985710+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:27.985858+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:28.986003+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:29.986153+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:30.986322+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:31.986457+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:32.986644+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:33.986829+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:34.987114+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:35.987335+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:36.987515+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:37.987672+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:38.988091+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:39.988352+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:40.988527+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:41.988724+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:42.988923+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:43.989221+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:44.989370+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:45.989533+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:46.989717+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:47.989899+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:48.990047+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:49.990193+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:50.990464+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:51.990646+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:52.990775+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:53.990979+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:54.991191+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:55.991364+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:56.991500+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:57.991700+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:58.991881+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:59.992073+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:00.992239+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:01.992434+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:02.992594+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:03.992847+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:04.992998+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:05.993144+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:06.993319+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:07.993666+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:08.993843+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:09.993982+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:10.994186+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:11.994332+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:12.994713+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:13.994980+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:14.995206+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:15.995349+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:16.995624+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:17.995889+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:18.996060+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:19.996221+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:20.996378+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:21.996573+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:22.996896+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:23.997123+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:24.997265+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:25.997448+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:26.997646+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:27.997837+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:28.998100+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:29.998323+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:30.998555+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:31.998784+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:32.999003+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:33.999279+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:34.999460+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:35.999634+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:36.999875+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:38.000055+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:39.000275+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:40.000477+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:41.000683+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:42.000890+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:43.001125+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:44.001363+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:45.001569+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:46.001868+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:47.002096+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:48.002221+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:49.002439+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:50.002723+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:51.002908+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:52.003061+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:53.003266+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:54.003551+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:55.003799+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:56.004088+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:57.004266+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:58.004443+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:59.004623+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:00.004788+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:01.004975+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:02.005202+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:03.005422+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:04.005684+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:05.005918+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:06.006109+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:07.006264+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:08.006461+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:09.006667+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:10.006852+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:11.007060+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:12.007262+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:13.007414+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:14.007620+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:15.007857+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:16.008074+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:17.008295+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:18.008446+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:19.008581+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:20.008778+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:21.008919+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:22.009164+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:23.009365+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:24.009633+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:25.009938+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:26.010130+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:27.010351+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:28.010523+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:29.010673+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:30.010873+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:31.011009+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:32.011201+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:33.011323+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:34.011532+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:35.011655+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:36.011871+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:37.012081+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:38.012274+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:39.012509+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:40.012698+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:41.012857+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:42.012972+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:43.013089+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:44.013272+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:45.013486+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:46.013664+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:47.013865+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:48.014033+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:49.014397+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:50.014552+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:51.014679+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:52.014880+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:53.015078+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:54.015277+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:55.015499+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:56.015698+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:57.015927+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:58.016186+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:59.016388+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:00.016658+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:01.016893+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:02.017066+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:03.017286+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:04.017513+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:05.017732+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:06.017948+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:07.018089+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:08.018301+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:09.018451+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:10.018576+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:11.018707+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:12.018897+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:13.019074+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:14.019310+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:15.019447+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:16.019633+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:17.019787+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:18.020200+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:19.020340+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:20.020532+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:21.020716+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:22.020920+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:23.021132+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:24.021340+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:25.021472+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:26.021658+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:27.021879+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:28.022026+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:29.022167+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:30.022342+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:31.022488+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:32.022656+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:33.022877+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:34.023074+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:35.023281+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:36.023421+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:37.023623+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:38.024042+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:39.024205+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:40.024408+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:41.024640+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:42.024876+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:43.025064+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:44.025238+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:45.025392+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:46.025581+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:47.025794+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:48.026067+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:49.026251+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:50.026425+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:51.026627+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:52.026777+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:53.026969+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:54.027162+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:55.027317+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:56.027549+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:57.027734+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:58.027897+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 0 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:59.028083+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 0 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:00.028257+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:01.028479+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:02.028617+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:03.028892+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:04.029106+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:05.029274+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:06.029575+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:07.030327+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:08.030895+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:09.031311+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:10.031640+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:11.031936+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:12.032115+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:13.032268+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:14.032385+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:15.032888+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:16.033309+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:17.033685+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:18.033988+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:19.034236+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:20.034451+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:21.034636+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:22.034885+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:23.035045+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:24.035261+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:25.035441+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:26.035608+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:27.035777+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:28.035906+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:29.036194+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:30.036360+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:31.036516+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:32.036741+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:33.036934+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:34.037121+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:35.037275+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:36.037469+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:37.037698+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:38.037866+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:39.038031+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:40.038203+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:41.039403+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:42.039562+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:43.039713+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:44.039939+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:45.040135+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:46.040267+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:47.040464+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:48.040652+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:49.040878+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:50.040995+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:51.041144+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:52.041314+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:53.041481+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:54.041680+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:55.041916+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:56.042060+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:57.042275+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:58.042528+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:59.042741+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:00.042960+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:01.043092+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:02.043264+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:03.043429+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:04.043624+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:05.043834+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:06.043954+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:07.044131+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:08.044295+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:09.044460+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:10.044578+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:11.044715+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:12.044863+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:13.045002+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:14.045159+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:15.045305+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:16.045444+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:17.045605+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:52 compute-2 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:52 compute-2 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:18.045776+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 958464 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:19.045938+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 761856 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: do_command 'config diff' '{prefix=config diff}'
Nov 29 06:58:52 compute-2 ceph-osd[79822]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 29 06:58:52 compute-2 ceph-osd[79822]: do_command 'config show' '{prefix=config show}'
Nov 29 06:58:52 compute-2 ceph-osd[79822]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 29 06:58:52 compute-2 ceph-osd[79822]: do_command 'counter dump' '{prefix=counter dump}'
Nov 29 06:58:52 compute-2 ceph-osd[79822]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 29 06:58:52 compute-2 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 06:58:52 compute-2 ceph-osd[79822]: do_command 'counter schema' '{prefix=counter schema}'
Nov 29 06:58:52 compute-2 ceph-osd[79822]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:20.046100+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74924032 unmapped: 1564672 heap: 76488704 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: tick
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_tickets
Nov 29 06:58:52 compute-2 ceph-osd[79822]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:21.046243+0000)
Nov 29 06:58:52 compute-2 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 75235328 unmapped: 1253376 heap: 76488704 old mem: 2845415832 new mem: 2845415832
Nov 29 06:58:52 compute-2 ceph-osd[79822]: do_command 'log dump' '{prefix=log dump}'
Nov 29 06:58:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:53.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:53 compute-2 podman[243336]: 2025-11-29 06:58:53.390706896 +0000 UTC m=+0.080409318 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:58:53 compute-2 podman[243334]: 2025-11-29 06:58:53.415723523 +0000 UTC m=+0.107004878 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 06:58:53 compute-2 podman[243329]: 2025-11-29 06:58:53.426207267 +0000 UTC m=+0.118027886 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 06:58:53 compute-2 crontab[243452]: (root) LIST (root)
Nov 29 06:58:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/258398947' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:58:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/532443902' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 29 06:58:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/4111572753' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:58:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2584480393' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 29 06:58:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3758656299' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:58:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/844408131' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 29 06:58:53 compute-2 ceph-mon[77142]: from='client.24970 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:53 compute-2 ceph-mon[77142]: from='client.24893 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1662226395' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:58:53 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/4145305371' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 29 06:58:53 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:53 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:53 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:53.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:53 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:53 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Nov 29 06:58:53 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1383921042' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 29 06:58:53 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Nov 29 06:58:53 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2433791124' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 29 06:58:54 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Nov 29 06:58:54 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1819404255' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:58:54 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Nov 29 06:58:54 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3266793758' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:58:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:55.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:55 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:55 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:55 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:55.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:55 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Nov 29 06:58:55 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/800220038' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:58:56 compute-2 systemd[1]: Starting Hostname Service...
Nov 29 06:58:56 compute-2 systemd[1]: Started Hostname Service.
Nov 29 06:58:57 compute-2 sudo[243846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:58:57 compute-2 sudo[243846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:57 compute-2 sudo[243846]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/3995147939' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2877614875' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2517383288' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1377123572' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3003957364' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1120183002' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: pgmap v1392: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:57 compute-2 ceph-mon[77142]: pgmap v1393: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1383921042' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2433791124' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2594614874' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1233692885' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3946325939' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/3661981742' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:58:57 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3783031133' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:58:57 compute-2 sudo[243873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:58:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:57.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:57 compute-2 sudo[243873]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:57 compute-2 sudo[243873]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:57 compute-2 sudo[243902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:58:57 compute-2 sudo[243902]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:57 compute-2 sudo[243902]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:57 compute-2 sudo[243930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:58:57 compute-2 sudo[243930]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:57 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:57 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:57 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:57.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:57 compute-2 sudo[243930]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:58 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:58 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Nov 29 06:58:58 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2561335456' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Nov 29 06:58:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:59.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:59 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:58:59 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:59 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:59.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:59 compute-2 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 06:58:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Nov 29 06:58:59 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2442087058' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:58:59 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Nov 29 06:58:59 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4281976021' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Nov 29 06:59:00 compute-2 sudo[244324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:59:00 compute-2 sudo[244324]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:59:00 compute-2 sudo[244324]: pam_unix(sudo:session): session closed for user root
Nov 29 06:59:00 compute-2 sudo[244364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:59:00 compute-2 sudo[244364]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:59:00 compute-2 sudo[244364]: pam_unix(sudo:session): session closed for user root
Nov 29 06:59:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Nov 29 06:59:00 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4203124747' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Nov 29 06:59:00 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Nov 29 06:59:00 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2772849280' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:59:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:59:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:59:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:59:01.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.15078 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.15093 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/1819404255' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.24944 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3635354810' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.25021 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1423480369' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/3266793758' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.24950 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3429457872' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.25033 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/634521314' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/4117577449' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.24962 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: pgmap v1394: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.25039 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/363659733' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.24968 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.25057 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.15147 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.15153 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.24983 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.25066 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/4186939961' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.15168 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3606582697' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 29 06:59:01 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:59:01 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:59:01 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:59:01.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:59:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Nov 29 06:59:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1430961509' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Nov 29 06:59:02 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Nov 29 06:59:02 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3531646775' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Nov 29 06:59:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Nov 29 06:59:03 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2078930364' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Nov 29 06:59:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:59:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:59:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:59:03.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:59:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Nov 29 06:59:03 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/55082743' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Nov 29 06:59:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Nov 29 06:59:03 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2313769865' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Nov 29 06:59:03 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:59:03 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:59:03 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:59:03.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:59:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:59:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Nov 29 06:59:03 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4055116638' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Nov 29 06:59:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Nov 29 06:59:03 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/172091942' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Nov 29 06:59:03 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Nov 29 06:59:03 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3473434345' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Nov 29 06:59:04 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3044107192' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/1183724430' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/800220038' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.24989 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.25075 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.25081 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.25087 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.24998 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.15186 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: pgmap v1395: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.25096 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.25010 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.25105 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.25016 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.25117 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.25028 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2561335456' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1523353195' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: pgmap v1396: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.15207 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/1840177804' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/3562221273' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.101:0/2432053198' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.15219 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/2442087058' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.102:0/4281976021' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/95145201' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.15231 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/2283979250' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/871154065' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:59:04 compute-2 ceph-mon[77142]: from='client.? 192.168.122.100:0/3338702776' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Nov 29 06:59:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Nov 29 06:59:05 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3986255285' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Nov 29 06:59:05 compute-2 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 06:59:05 compute-2 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:59:05 compute-2 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:59:05.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:59:05 compute-2 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Nov 29 06:59:05 compute-2 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1013829368' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
